Jan. 21, 2022, 2:10 a.m. | Zhuoyuan Mao, Chenhui Chu, Sadao Kurohashi

cs.CL updates on arXiv.org

In the present study, we propose novel sequence-to-sequence pre-training
objectives for low-resource neural machine translation (NMT): Japanese-specific
sequence-to-sequence (JASS) for language pairs involving Japanese as the source
or target language, and English-specific sequence-to-sequence (ENSS) for
language pairs involving English. JASS focuses on masking and reordering
Japanese linguistic units known as bunsetsu, whereas ENSS is based on
phrase-structure masking and reordering tasks. Experiments on the ASPEC
Japanese--English and Japanese--Chinese, Wikipedia Japanese--Chinese, and News
English--Korean corpora demonstrate that JASS …
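The two operations the abstract describes, masking and reordering linguistic units, can be sketched as simple data-generation functions for a denoising sequence-to-sequence objective. This is an illustrative sketch only, not the authors' implementation: the function names, mask ratio, and the toy bunsetsu segmentation below are all assumptions.

```python
import random

MASK = "<mask>"  # placeholder mask token (assumption; actual token is model-specific)

def unit_mask(units, mask_ratio=0.35, rng=None):
    """Replace a random subset of linguistic units with a mask token.

    Returns (corrupted input, reconstruction target); a seq2seq model is
    trained to recover the original sequence from the masked one.
    """
    rng = rng or random.Random(0)
    n = max(1, int(len(units) * mask_ratio))
    masked_idx = set(rng.sample(range(len(units)), n))
    src = [MASK if i in masked_idx else u for i, u in enumerate(units)]
    return src, list(units)

def unit_reorder(units, rng=None):
    """Shuffle unit order; the model learns to restore the original order."""
    rng = rng or random.Random(0)
    src = list(units)
    rng.shuffle(src)
    return src, list(units)

# Toy example: a Japanese sentence pre-segmented into three bunsetsu
# (segmentation shown here is hand-made for illustration).
bunsetsu = ["私は", "本を", "読んだ"]
masked_src, target = unit_mask(bunsetsu)
shuffled_src, _ = unit_reorder(bunsetsu)
```

In practice the same corruption scheme applies to English phrase-structure units for ENSS; only the segmenter producing `units` changes.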

Tags: arXiv, machine translation, neural machine translation, training
