March 20, 2024, 4:48 a.m. | Yueen Ma, Dafeng Chi, Jingjing Li, Kai Song, Yuzheng Zhuang, Irwin King

cs.CL updates on arXiv.org

arXiv:2307.00852v2 Announce Type: replace
Abstract: The natural language generation domain has witnessed great success thanks to Transformer models. Although they achieve state-of-the-art generative quality, they often neglect generative diversity. Prior attempts to tackle this issue suffer from either low model capacity or over-complicated architectures. Some recent methods employ the variational autoencoder (VAE) framework to enhance diversity, but their latent variables fully depend on the input context, restricting exploration of the latent space. In this paper, we introduce VOLTA, a framework that …

