May 15, 2023, 12:46 a.m. | Pengzhi Gao, Liwen Zhang, Zhongjun He, Hua Wu, Haifeng Wang

cs.CL updates on arXiv.org

The multilingual neural machine translation (NMT) model has a promising
capability for zero-shot translation: it can directly translate between
language pairs unseen during training. To transfer well from supervised
directions to zero-shot directions, the multilingual NMT model is expected to
learn universal representations across different languages. This paper
introduces a cross-lingual consistency regularization, CrossConST, to bridge
the representation gap among different languages and boost zero-shot
translation performance. The theoretical analysis shows that CrossConST
implicitly maximizes the probability distribution …
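
The abstract does not spell out the exact form of the regularizer, but a cross-lingual consistency term of this kind is typically a divergence between the model's output distributions for semantically equivalent encoder inputs in different languages. Below is a minimal, illustrative PyTorch sketch of such a term; the function names, the symmetric KL formulation, and the weight `alpha` are assumptions made for illustration, not details taken from the paper.

```python
# Illustrative sketch of a cross-lingual consistency regularization term.
# Not the CrossConST implementation; formulation and names are assumed.

import torch
import torch.nn.functional as F


def consistency_loss(logits_src: torch.Tensor,
                     logits_xling: torch.Tensor) -> torch.Tensor:
    """Symmetric KL divergence between decoder output distributions obtained
    from two semantically equivalent encoder inputs (e.g., the source sentence
    and its counterpart in another language).

    logits_src, logits_xling: (batch, tgt_len, vocab) decoder logits.
    """
    p = F.log_softmax(logits_src, dim=-1)
    q = F.log_softmax(logits_xling, dim=-1)
    # KL in both directions, averaged over the batch.
    kl_pq = F.kl_div(q, p, log_target=True, reduction="batchmean")
    kl_qp = F.kl_div(p, q, log_target=True, reduction="batchmean")
    return 0.5 * (kl_pq + kl_qp)


def total_loss(ce_loss: torch.Tensor,
               logits_src: torch.Tensor,
               logits_xling: torch.Tensor,
               alpha: float = 1.0) -> torch.Tensor:
    """Standard cross-entropy translation loss plus the weighted consistency term."""
    return ce_loss + alpha * consistency_loss(logits_src, logits_xling)
```

Pulling the two output distributions toward each other is one simple way to encourage the encoder representations of different languages to align, which is the stated goal of bridging the representation gap for zero-shot directions.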
