Improving Zero-shot Multilingual Neural Machine Translation by Leveraging Cross-lingual Consistency Regularization. (arXiv:2305.07310v1 [cs.CL])
cs.CL updates on arXiv.org
Multilingual neural machine translation (NMT) models show promise for zero-shot
translation: they can directly translate between language pairs unseen during
training. To transfer well from supervised directions to zero-shot directions,
a multilingual NMT model is expected to learn universal representations across
different languages. This paper introduces a cross-lingual consistency
regularization, CrossConST, to bridge the representation gap among different
languages and boost zero-shot translation performance. The theoretical
analysis shows that CrossConST implicitly maximizes the probability
distribution …
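The idea of a consistency regularizer can be pictured roughly as follows: add a penalty that pulls the model's output distribution conditioned on one language toward its distribution conditioned on another. This is only an illustrative sketch, not the paper's exact formulation; the function names, the one-sided KL form, and the weight `alpha` are assumptions.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def kl_div(p, q, eps=1e-12):
    # KL(p || q) for two discrete distributions (eps avoids log(0)).
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def consistency_loss(ce_loss, logits_src, logits_tgt, alpha=1.0):
    # Hypothetical sketch of a cross-lingual consistency objective:
    # the usual cross-entropy term plus a KL penalty that encourages the
    # decoder distribution conditioned on the source-language input to
    # match the one conditioned on the other-language input.
    p = softmax(logits_src)
    q = softmax(logits_tgt)
    return ce_loss + alpha * kl_div(p, q)
```

When the two conditional distributions already agree, the penalty vanishes and the objective reduces to plain cross-entropy; the more they diverge, the larger the extra loss, which is what nudges the encoder toward language-universal representations.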