March 26, 2024, 4:51 a.m. | Shaoxiong Ji, Timothee Mickus, Vincent Segonne, Jörg Tiedemann

cs.CL updates on arXiv.org

arXiv:2403.16777v1 Announce Type: new
Abstract: Multilingual pretraining and fine-tuning have proven remarkably successful across a wide range of natural language processing tasks. Transferring representations from one language to another is especially crucial for cross-lingual learning. Machine translation objectives can be expected to be well suited to fostering such capabilities, as they involve the explicit alignment of semantically equivalent sentences from different languages. This paper investigates the potential benefits of employing machine translation as a continued training objective to enhance language representation learning, bridging multilingual …
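The core idea, continuing to train a pretrained multilingual model with a translation objective on parallel data, can be illustrated with a minimal sketch. The snippet below is an illustrative assumption, not the paper's actual setup: the model choice (google/mt5-small), the toy sentence pairs, and the hyperparameters are all placeholders; it only shows a standard seq2seq cross-entropy translation loss used as a continued-training signal.

```python
# Minimal sketch (NOT the paper's setup): continued training of a pretrained
# multilingual seq2seq model on parallel sentences with a translation objective.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/mt5-small"  # placeholder multilingual model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy parallel data: semantically equivalent sentences in two languages.
pairs = [
    ("The cat sleeps.", "Le chat dort."),
    ("I like coffee.", "J'aime le café."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for src, tgt in pairs:
    inputs = tokenizer(src, return_tensors="pt")
    labels = tokenizer(tgt, return_tensors="pt").input_ids
    # Seq2seq cross-entropy loss: the alignment signal comes from supervising
    # the model to generate the target-language equivalent of the source.
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In practice one would run this over a large parallel corpus with batching and a learning-rate schedule; the point here is only that the translation objective explicitly pairs semantically equivalent sentences across languages during continued training.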

