Anticipation-Free Training for Simultaneous Machine Translation. (arXiv:2201.12868v2 [cs.CL] UPDATED)
May 5, 2022, 1:11 a.m. | Chih-Chiang Chang, Shun-Po Chuang, Hung-yi Lee
cs.CL updates on arXiv.org arxiv.org
Simultaneous machine translation (SimulMT) speeds up the translation process
by starting to translate before the source sentence is fully available. The
task is difficult because of limited context and word-order differences
between languages. Existing methods either increase latency or introduce
adaptive read-write policies so that SimulMT models can handle local
reordering and improve translation quality. However, long-distance reordering
can cause SimulMT models to learn translations incorrectly. Specifically, the
model may be forced to predict target tokens when the corresponding source
tokens …
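To make the latency trade-off concrete, here is a minimal sketch of a fixed wait-k read-write policy, a standard baseline in this line of work (the paper itself proposes a different, anticipation-free training scheme). The `translate_step` function is a hypothetical placeholder for a real SimulMT model:

```python
def translate_step(source_prefix, target_prefix):
    # Placeholder "model": echoes the next source token.
    # A real SimulMT model would condition on both prefixes.
    return source_prefix[len(target_prefix)]

def wait_k_decode(source_stream, k=3):
    """READ the first k source tokens, then alternate WRITE/READ.

    The fixed lag k bounds latency but forces the model to commit to
    target tokens before distant source context arrives, which is the
    anticipation problem the abstract describes.
    """
    source, target = [], []
    for token in source_stream:
        source.append(token)  # READ one source token
        if len(source) >= k:
            target.append(translate_step(source, target))  # WRITE one target token
    # Source exhausted: flush the remaining target tokens.
    while len(target) < len(source):
        target.append(translate_step(source, target))
    return target

print(wait_k_decode(["a", "b", "c", "d", "e"], k=3))
```

With k=3, the first target token is emitted only after three source tokens have been read; an adaptive policy would instead decide READ vs WRITE per step based on the model's state.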