April 23, 2024, 4:50 a.m. | Nadezhda Chirkova, Sheng Liang, Vassilina Nikoulina

cs.CL updates on arXiv.org arxiv.org

arXiv:2310.09917v3 Announce Type: replace
Abstract: Zero-shot cross-lingual knowledge transfer enables the multilingual pretrained language model (mPLM), finetuned on a task in one language, make predictions for this task in other languages. While being broadly studied for natural language understanding tasks, the described setting is understudied for generation. Previous works notice a frequent problem of generation in a wrong language and propose approaches to address it, usually using mT5 as a backbone model. In this work, we test alternative mPLMs, such …

