March 26, 2024, 4:51 a.m. | Shaoxiong Ji, Timothee Mickus, Vincent Segonne, Jörg Tiedemann

cs.CL updates on arXiv.org

arXiv:2403.16777v1 Announce Type: new
Abstract: Multilingual pretraining and fine-tuning have remarkably succeeded in various natural language processing tasks. Transferring representations from one language to another is especially crucial for cross-lingual learning. One can expect machine translation objectives to be well suited to fostering such capabilities, as they involve the explicit alignment of semantically equivalent sentences from different languages. This paper investigates the potential benefits of employing machine translation as a continued training objective to enhance language representation learning, bridging multilingual …
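To make the idea concrete, here is a minimal sketch of what continued training with a machine translation (seq2seq cross-entropy) objective can look like in practice, using Hugging Face Transformers. The checkpoint name, the toy parallel pairs, and the hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Hedged sketch: continued training of a pretrained multilingual model
# with a machine translation objective. The model, data, and learning
# rate below are assumptions for illustration, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/mt5-small"  # assumed stand-in for a multilingual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Toy parallel corpus: semantically equivalent sentence pairs
# in two languages, the signal the abstract refers to.
parallel_pairs = [
    ("translate English to German: The cat sleeps.", "Die Katze schläft."),
    ("translate English to German: I like tea.", "Ich mag Tee."),
]

model.train()
for src, tgt in parallel_pairs:
    batch = tokenizer(src, return_tensors="pt")
    labels = tokenizer(text_target=tgt, return_tensors="pt").input_ids
    # The MT objective: cross-entropy over target tokens, which
    # implicitly aligns source and target representations.
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In a real continued-training run this loop would operate over batched parallel data with a learning-rate schedule; the point here is only that the translation loss explicitly ties together equivalent sentences across languages.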
