DA-Net: A Disentangled and Adaptive Network for Multi-Source Cross-Lingual Transfer Learning
March 8, 2024, 5:47 a.m. | Ling Ge, Chunming Hu, Guanghui Ma, Jihong Liu, Hong Zhang
cs.CL updates on arXiv.org (arxiv.org)
Abstract: Multi-source cross-lingual transfer learning transfers task knowledge from multiple labelled source languages to an unlabelled target language under language shift. Existing methods typically weight the predictions produced by language-specific classifiers of the different sources, which follow a shared encoder. However, because that shared encoder is updated by all source languages, the extracted representations inevitably mix information from the different sources, which may disturb the learning of …
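The "existing methods" setup the abstract describes can be sketched minimally: one shared encoder feeding several language-specific classifier heads, whose predictions are combined with per-source weights. Everything below (the toy linear encoder, the head shapes, the fixed weights) is an illustrative assumption, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_encoder(x):
    # Toy stand-in for the shared encoder (assumption; in practice this
    # would be a multilingual pretrained model updated by all sources).
    W = np.full((4, 3), 0.5)
    return x @ W

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# One language-specific classifier head per labelled source language
# (hypothetical source set and dimensions).
source_heads = {
    "en": rng.normal(size=(3, 2)),
    "de": rng.normal(size=(3, 2)),
    "fr": rng.normal(size=(3, 2)),
}

def predict_target(x, weights):
    """Weighted combination of per-source predictions for target data."""
    h = shared_encoder(x)  # same representation is shared by all sources
    probs = {lang: softmax(h @ head) for lang, head in source_heads.items()}
    return sum(weights[lang] * probs[lang] for lang in probs)

x = rng.normal(size=(2, 4))                   # two target-language examples
weights = {"en": 0.5, "de": 0.3, "fr": 0.2}   # source weights sum to 1
combined = predict_target(x, weights)
print(combined.shape)
```

Because each head outputs a probability distribution and the source weights sum to 1, the combined output is itself a valid distribution per example. The paper's critique is aimed at the `shared_encoder` step: since every source updates it, `h` entangles source-language information.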