Nov. 4, 2022, 1:12 a.m. | Simon Damm, Dennis Forster, Dmytro Velychko, Zhenwen Dai, Asja Fischer, Jörg Lücke

cs.LG updates on arXiv.org

The central objective function of a variational autoencoder (VAE) is its
variational lower bound (the ELBO). Here we show that for standard (i.e.,
Gaussian) VAEs the ELBO converges to a value given by the sum of three
entropies: the (negative) entropy of the prior distribution, the expected
(negative) entropy of the observable distribution, and the average entropy of
the variational distributions (the latter is already part of the ELBO). Our
derived analytical results are exact and apply for small as …
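
Spelled out, the limit described above takes (up to notation) a form like the following three-entropy sum, with one term each for the variational, prior, and observable (decoder) distributions. This is a sketch reconstructed from the abstract alone; the symbols $\mathcal{L}$, $q_\Phi$, $p_\Theta$, and the data index $n$ are assumed notation, not quoted from the paper.

\mathcal{L}(\Phi,\Theta) \;\longrightarrow\;
\underbrace{\frac{1}{N}\sum_{n=1}^{N} \mathcal{H}\big[q_\Phi(z \mid x^{(n)})\big]}_{\text{average entropy of variational dists.}}
\;-\; \underbrace{\mathcal{H}\big[p_\Theta(z)\big]}_{\text{(negative) entropy of prior}}
\;-\; \underbrace{\frac{1}{N}\sum_{n=1}^{N} \mathbb{E}_{q_\Phi(z \mid x^{(n)})}\Big[\mathcal{H}\big[p_\Theta(x \mid z)\big]\Big]}_{\text{expected (negative) entropy of observable dist.}}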

arxiv evidence variational autoencoders
