March 5, 2024, 2:52 p.m. | Séamus Lankford, Haithem Afli, Andy Way

cs.CL updates on arXiv.org

arXiv:2403.01985v1 Announce Type: new
Abstract: The Transformer model is the state of the art in Machine Translation. However, neural translation models generally underperform on language pairs with insufficient training data. As a consequence, relatively few experiments have been carried out using this architecture on low-resource language pairs. In this study, we evaluate hyperparameter optimization of Transformer models for translating the low-resource English-Irish language pair. We demonstrate that choosing appropriate parameters leads to considerable performance improvements. Most importantly, the correct choice …
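To make the idea concrete, below is a minimal sketch of what such a hyperparameter search might look like for a Transformer NMT system. The search space, the train_and_evaluate() stub, and the dummy BLEU scores it returns are illustrative assumptions only; they do not reflect the authors' actual toolkit, configuration, or results.

# Minimal sketch of grid-search hyperparameter optimization for a
# Transformer NMT model. All values and the evaluation stub are
# hypothetical placeholders, not the paper's actual setup.
import itertools
import random

# Hypothetical search space over common Transformer hyperparameters.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "num_heads": [2, 4, 8],
    "embedding_dim": [128, 256, 512],
    "dropout": [0.1, 0.3],
    "label_smoothing": [0.1, 0.3],
}


def train_and_evaluate(config: dict) -> float:
    """Placeholder: train a Transformer on an English-Irish parallel corpus
    with `config` and return validation BLEU. A real run would call an NMT
    toolkit here; this stub just returns a dummy score."""
    return random.uniform(0.0, 60.0)


def grid_search(space: dict) -> tuple:
    """Exhaustively evaluate every combination and keep the best config."""
    best_config, best_bleu = None, float("-inf")
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        config = dict(zip(keys, values))
        bleu = train_and_evaluate(config)
        if bleu > best_bleu:
            best_config, best_bleu = config, bleu
    return best_config, best_bleu


if __name__ == "__main__":
    config, bleu = grid_search(SEARCH_SPACE)
    print(f"Best config: {config} (BLEU {bleu:.1f})")

In practice, an exhaustive grid of this size is expensive for full training runs, so random search or early stopping over a subset of combinations is a common alternative.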

