Shared Latent Space by Both Languages in Non-Autoregressive Neural Machine Translation. (arXiv:2305.03511v1 [cs.CL])
cs.CL updates on arXiv.org
Latent variable modeling in non-autoregressive neural machine translation (NAT) is a promising approach to mitigating the multimodality problem. Previous works added an auxiliary model to estimate the posterior distribution of the latent variable conditioned on the source and target sentences. However, this approach has several drawbacks: redundant information captured in the latent variable, an increased parameter count, and a tendency to ignore part of the information in the inputs. In this paper, we propose a new latent …
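As context for the setup the abstract describes: in latent-variable NAT, a prior network predicts p(z | x) from the source sentence alone, while the auxiliary posterior network estimates q(z | x, y) using both source and target, and training penalizes the divergence between them. The sketch below is illustrative, not from the paper; the toy Gaussian parameters and the diagonal-Gaussian assumption are ours, standing in for the outputs of learned networks.

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL(q || p) for diagonal Gaussians, summed over latent dimensions.

    q plays the role of the auxiliary posterior q(z | x, y);
    p plays the role of the source-only prior p(z | x).
    """
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

# Toy stand-ins for network outputs (4-dimensional latent):
mu_p, logvar_p = np.zeros(4), np.zeros(4)           # prior p(z|x): standard normal
mu_q, logvar_q = np.full(4, 0.5), np.full(4, -0.2)  # posterior q(z|x,y): shifted, sharper

kl = kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p)
print(round(float(kl), 4))  # → 0.5375
```

This KL term is what couples the two models during training; at inference time only the prior is available, since the target sentence is unknown.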