Recurrence Boosts Diversity! Revisiting Recurrent Latent Variable in Transformer-Based Variational AutoEncoder for Diverse Text Generation. (arXiv:2210.12409v3 [cs.CL] UPDATED)
cs.CL updates on arXiv.org
The Variational Auto-Encoder (VAE) has been widely adopted in text generation.
Among its many variants, the recurrent VAE learns token-wise latent variables,
each conditioned on the preceding ones, which captured sequential variability
well in the RNN era. However, it is unclear how to incorporate such recurrent
dynamics into the now-dominant Transformer, whose computation is parallel
rather than sequential. In this work, we propose TRACE, a Transformer-based
recurrent VAE structure. TRACE imposes recurrence on segment-wise latent
variables over arbitrarily separated text segments and constructs …
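The excerpt describes the core idea, segment-wise latent variables chained recurrently on top of a Transformer, without giving the exact parameterization. The sketch below is a minimal PyTorch illustration of that idea, not the paper's actual TRACE implementation: the class name, mean-pooled segment summaries, Gaussian posterior, and all layer sizes are assumptions made for clarity.

```python
import torch
import torch.nn as nn

class SegmentRecurrentLatent(nn.Module):
    """Toy sketch: one latent variable per text segment, each
    conditioned on the previous segment's latent (the recurrence)."""

    def __init__(self, d_model=64, z_dim=16, nhead=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Posterior network sees the segment summary and the previous latent.
        self.to_stats = nn.Linear(d_model + z_dim, 2 * z_dim)
        self.z_dim = z_dim

    def forward(self, x, seg_len):
        # x: (batch, seq_len, d_model) embedded tokens.
        # seg_len: an int or a list of lengths, so segments may be
        # "arbitrarily separated" as in the abstract.
        h = self.encoder(x)  # full-sequence encoding stays parallel
        z_prev = torch.zeros(x.size(0), self.z_dim)
        zs = []
        for seg in h.split(seg_len, dim=1):
            summary = seg.mean(dim=1)  # pool one segment into a vector
            stats = self.to_stats(torch.cat([summary, z_prev], dim=-1))
            mu, logvar = stats.chunk(2, dim=-1)
            # Reparameterization trick: z = mu + sigma * eps.
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            zs.append(z)
            z_prev = z  # recurrence across segments
        return torch.stack(zs, dim=1)  # (batch, n_segments, z_dim)

# Usage: 3 segments of length 4 over a 12-token sequence.
model = SegmentRecurrentLatent()
x = torch.randn(2, 12, 64)
print(model(x, seg_len=4).shape)  # torch.Size([2, 3, 16])
```

Note the trade-off the abstract alludes to: the Transformer still encodes the whole sequence in parallel, and only the short loop over segment latents is sequential, far cheaper than token-wise recurrence.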