April 16, 2024, 4:45 a.m. | Boshko Koloski, Blaž Škrlj, Marko Robnik-Šikonja, Senja Pollak

cs.LG updates on arXiv.org

arXiv:2309.06089v2 Announce Type: replace-cross
Abstract: Cross-lingual transfer is a promising technique for solving tasks in less-resourced languages. In this empirical study, we compare two fine-tuning approaches combined with zero-shot and full-shot learning approaches for large language models in a cross-lingual setting. As fine-tuning strategies, we compare parameter-efficient adapter methods with fine-tuning of all parameters. As cross-lingual transfer strategies, we compare intermediate-training (IT), which uses each language sequentially, and cross-lingual validation (CLV), which uses a target language already in …
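The abstract contrasts a parameter-efficient adapter method with full-parameter fine-tuning for cross-lingual transfer. As a rough illustration of that comparison (not the paper's exact setup), the sketch below uses Hugging Face transformers with the peft library's LoRA adapters as a stand-in for the adapter method; the model name, label count, and hyperparameters are assumptions.

```python
# Minimal sketch of the two fine-tuning regimes the abstract compares:
# (a) full fine-tuning of all parameters, and (b) a parameter-efficient
# adapter method. LoRA (via `peft`) stands in for the paper's adapter
# approach; the backbone, task, and labels are illustrative assumptions.
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

MODEL_NAME = "xlm-roberta-base"  # assumed multilingual backbone

# (a) Full fine-tuning: every parameter of the backbone stays trainable.
full_model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=3
)

# (b) Parameter-efficient fine-tuning: freeze the backbone and train only
# small low-rank adapter matrices injected into the attention projections.
adapter_config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],
)
adapter_model = get_peft_model(
    AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3),
    adapter_config,
)
adapter_model.print_trainable_parameters()  # typically ~1% of all parameters

# The intermediate-training (IT) strategy described in the abstract would
# then train either model on each language's data sequentially, e.g.:
# for lang_dataset in [source_lang_ds, target_lang_ds]:  # assumed order
#     trainer = Trainer(model=adapter_model, train_dataset=lang_dataset, ...)
#     trainer.train()
```

The design choice the study probes is visible here: the adapter variant updates only a small fraction of the weights, which is what makes it attractive for sequential, multi-language training where catastrophic forgetting is a concern.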

