Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning?
March 26, 2024, 4:51 a.m. | Shaoxiong Ji, Timothée Mickus, Vincent Segonne, Jörg Tiedemann
cs.CL updates on arXiv.org
Abstract: Multilingual pretraining and fine-tuning have remarkably succeeded in various natural language processing tasks. Transferring representations from one language to another is especially crucial for cross-lingual learning. One can expect machine translation objectives to be well suited to fostering such capabilities, as they involve the explicit alignment of semantically equivalent sentences from different languages. This paper investigates the potential benefits of employing machine translation as a continued training objective to enhance language representation learning, bridging multilingual …
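The abstract's central premise is that machine translation objectives foster cross-lingual transfer by explicitly aligning semantically equivalent sentences across languages. As a hedged illustration (not the paper's actual method or code), the toy sketch below shows one common way such alignment is operationalized: an InfoNCE-style contrastive loss over a batch of parallel sentence embeddings, where each source sentence should match its own translation rather than any other target in the batch. The embeddings and function names here are hypothetical.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense sentence vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def alignment_loss(src_vecs, tgt_vecs, temperature=0.1):
    """InfoNCE-style loss over parallel sentence embeddings:
    for each source sentence i, the matching translation i (the
    diagonal of the similarity matrix) should score higher than
    every other target sentence in the batch."""
    loss = 0.0
    for i, s in enumerate(src_vecs):
        logits = [cosine(s, t) / temperature for t in tgt_vecs]
        m = max(logits)  # log-sum-exp stabilization
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        loss += -(logits[i] - log_z)  # negative log-prob of true pair
    return loss / len(src_vecs)

# Toy "embeddings": correctly paired translations yield a lower
# loss than mismatched (shuffled) pairs.
src = [[1.0, 0.0], [0.0, 1.0]]
aligned = [[0.9, 0.1], [0.1, 0.9]]
shuffled = [[0.1, 0.9], [0.9, 0.1]]
print(alignment_loss(src, aligned) < alignment_loss(src, shuffled))  # True
```

The design intuition matches the abstract: training on translation pairs gives the model a direct supervisory signal that representations of equivalent sentences in different languages should coincide, which is precisely the property cross-lingual transfer relies on.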