March 8, 2024, 5:47 a.m. | Ling Ge, Chunming Hu, Guanghui Ma, Jihong Liu, Hong Zhang

cs.CL updates on arXiv.org

arXiv:2403.04158v1 Announce Type: new
Abstract: Multi-source cross-lingual transfer learning transfers task knowledge from multiple labelled source languages to an unlabelled target language under language shift. Existing methods typically focus on weighting the predictions produced by the language-specific classifiers of the different sources, which follow a shared encoder. However, because all source languages share and update the same encoder, the extracted representations inevitably mix information from the different source languages, which may disturb the learning of …
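The setup the abstract describes — a shared encoder feeding one classifier head per source language, with the target-language prediction formed as a weighted combination of the heads' outputs — can be sketched roughly as follows. All shapes, the tanh encoder, and the uniform weights are illustrative assumptions; the paper's actual weighting scheme is not shown in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions for illustration.
d_in, d_hid, n_classes, n_sources = 16, 8, 3, 2

# Shared encoder: a single projection applied to every language's input,
# so its parameters are updated by all source languages during training.
W_enc = rng.normal(size=(d_in, d_hid))

# One language-specific classifier head per labelled source language.
heads = [rng.normal(size=(d_hid, n_classes)) for _ in range(n_sources)]

# Per-source weights for combining predictions on the unlabelled target
# language (uniform here, purely as a placeholder).
weights = np.full(n_sources, 1.0 / n_sources)

x = rng.normal(size=(4, d_in))        # a batch of target-language features
h = np.tanh(x @ W_enc)                # shared representation
probs = sum(w * softmax(h @ W_head)   # weighted sum of source predictions
            for w, W_head in zip(weights, heads))
pred = probs.argmax(axis=-1)          # final target-language labels
```

Because the combination weights sum to one and each head's softmax output is a distribution, the combined `probs` rows remain valid probability distributions over the classes.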

