Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning?
March 26, 2024, 4:51 a.m. | Shaoxiong Ji, Timothee Mickus, Vincent Segonne, Jörg Tiedemann
cs.CL updates on arXiv.org arxiv.org
Abstract: Multilingual pretraining and fine-tuning have remarkably succeeded in various natural language processing tasks. Transferring representations from one language to another is especially crucial for cross-lingual learning. One can expect machine translation objectives to be well suited to fostering such capabilities, as they involve the explicit alignment of semantically equivalent sentences from different languages. This paper investigates the potential benefits of employing machine translation as a continued training objective to enhance language representation learning, bridging multilingual …
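The abstract's premise is that machine translation involves "explicit alignment of semantically equivalent sentences from different languages." One common way such alignment is operationalized (a hedged illustration, not necessarily the authors' method) is an in-batch translation-ranking loss: each source sentence should score its own translation higher than the other targets in the batch. A minimal pure-Python sketch, using toy hand-made sentence embeddings rather than a real encoder:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def translation_ranking_loss(src_embs, tgt_embs, scale=10.0):
    """In-batch softmax (ranking) loss over parallel sentence embeddings.

    For each source embedding, the scores against all target embeddings
    are softmax-normalized; the loss is the mean negative log-probability
    of the correct (aligned) translation.
    """
    total = 0.0
    for i, s in enumerate(src_embs):
        scores = [scale * cosine(s, t) for t in tgt_embs]
        log_z = math.log(sum(math.exp(x) for x in scores))
        total += log_z - scores[i]  # -log softmax of the aligned pair
    return total / len(src_embs)

# Toy 2-d embeddings: pair i of src/tgt_aligned are translations of
# each other; tgt_shuffled breaks the alignment.
src = [[1.0, 0.0], [0.0, 1.0]]
tgt_aligned = [[0.9, 0.1], [0.1, 0.9]]
tgt_shuffled = [tgt_aligned[1], tgt_aligned[0]]

# Aligned pairs should yield a lower loss than mismatched ones.
print(translation_ranking_loss(src, tgt_aligned)
      < translation_ranking_loss(src, tgt_shuffled))
```

In a real continued-training setup, the embeddings would come from the multilingual model being fine-tuned, so minimizing this loss pulls translations together in representation space, which is the cross-lingual transfer effect the paper investigates.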