Feb. 20, 2024, 5:52 a.m. | Nadezhda Chirkova, Vassilina Nikoulina

cs.CL updates on arXiv.org arxiv.org

arXiv:2402.12279v1 Announce Type: new
Abstract: Zero-shot cross-lingual generation implies finetuning a multilingual pretrained language model on a generation task in one language and then using it to make predictions for this task in other languages. Previous works note a frequent problem of generation in the wrong language and propose approaches to address it, usually using mT5 as a backbone model. In this work, we compare various approaches proposed in the literature in unified settings, also including alternative backbone models, …
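
For readers unfamiliar with the setup, here is a minimal sketch of zero-shot cross-lingual generation as described in the abstract: finetune a multilingual pretrained model on the task in one language, then generate in another. This is not the authors' code; the mT5 checkpoint, toy data, and hyperparameters are illustrative assumptions.

```python
# Sketch of zero-shot cross-lingual generation with an mT5 backbone
# (assumed setup, not the paper's exact training recipe).
import torch
from transformers import AutoTokenizer, MT5ForConditionalGeneration

model_name = "google/mt5-base"  # assumed backbone; the paper also compares alternatives
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# --- Finetune on the generation task in one language (English), toy example ---
src = "summarize: The quick brown fox jumps over the lazy dog near the river bank."
tgt = "A fox jumps over a dog."
batch = tokenizer(src, return_tensors="pt")
labels = tokenizer(tgt, return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model.train()
loss = model(**batch, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
optimizer.step()

# --- Zero-shot inference on the same task in another language (German) ---
model.eval()
test = "summarize: Der schnelle braune Fuchs springt über den faulen Hund."
with torch.no_grad():
    out = model.generate(**tokenizer(test, return_tensors="pt"), max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
# Failure mode studied in the paper: the output may come out in the
# finetuning language (English) rather than the input language.
```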

