Adaptive Cross-lingual Text Classification through In-Context One-Shot Demonstrations
April 4, 2024, 4:47 a.m. | Emilio Villa-Cueva, A. Pastor López-Monroy, Fernando Sánchez-Vega, Thamar Solorio
cs.CL updates on arXiv.org arxiv.org
Abstract: Zero-Shot Cross-lingual Transfer (ZS-XLT) utilizes a model trained in a source language to make predictions in another language, often with a performance loss. To alleviate this, additional improvements can be achieved through subsequent adaptation using examples in the target language. In this paper, we exploit In-Context Tuning (ICT) for One-Shot Cross-lingual transfer in the classification task by introducing In-Context Cross-lingual Transfer (IC-XLT). The novel concept involves training a model to learn from context examples and …
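The abstract describes conditioning a classifier on a single target-language demonstration placed in the input context. A minimal sketch of what such a one-shot prompt might look like is below; the function name and prompt format are illustrative assumptions, not the paper's exact implementation, which the truncated abstract does not specify.

```python
# Hypothetical sketch of a one-shot in-context input for cross-lingual
# classification: one labeled target-language example (the "one shot") is
# prepended to the query before it reaches a model trained on the source
# language. Format and names are assumptions for illustration only.

def build_one_shot_input(demo_text: str, demo_label: str, query_text: str) -> str:
    """Concatenate a single target-language demonstration with the query,
    so the model can condition its prediction on the in-context example."""
    return (
        f"Example: {demo_text}\n"
        f"Label: {demo_label}\n"
        f"Input: {query_text}\n"
        f"Label:"
    )

# A single Spanish demonstration adapts a classifier trained on English data.
prompt = build_one_shot_input(
    demo_text="La película fue una pérdida de tiempo.",
    demo_label="negative",
    query_text="Una historia conmovedora y bien actuada.",
)
print(prompt)
```

Under this framing, "training a model to learn from context examples" would mean fine-tuning on inputs of this shape, so that at adaptation time a single demonstration in the target language suffices.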