Improving Variational Autoencoder Estimation from Incomplete Data with Mixture Variational Families
March 6, 2024, 5:42 a.m. | Vaidotas Simkus, Michael U. Gutmann
cs.LG updates on arXiv.org arxiv.org
Abstract: We consider the task of estimating variational autoencoders (VAEs) when the training data is incomplete. We show that missing data increases the complexity of the model's posterior distribution over the latent variables compared to the fully-observed case. The increased complexity may adversely affect the fit of the model due to a mismatch between the variational and model posterior distributions. We introduce two strategies based on (i) finite variational-mixture and (ii) imputation-based variational-mixture distributions to address …
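The abstract's first strategy is a finite variational mixture: instead of a single Gaussian, the encoder's approximate posterior is a weighted mixture of Gaussians, which can track the more complex (e.g. multimodal) posterior induced by missing data. A minimal sketch of such a mixture family is below; the function names and the diagonal-Gaussian parameterisation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mixture_log_prob(z, weights, means, log_stds):
    """log q(z) under a K-component diagonal-Gaussian mixture.

    z: (D,) latent point; weights: (K,) mixture weights summing to 1;
    means, log_stds: (K, D) per-component parameters (illustrative names).
    """
    stds = np.exp(log_stds)
    # Per-component log N(z | mu_k, sigma_k^2), summed over dimensions.
    comp_log_prob = (
        -0.5 * (((z - means) / stds) ** 2 + 2.0 * log_stds + np.log(2.0 * np.pi))
    ).sum(axis=1)  # shape (K,)
    # log-sum-exp over components for numerical stability.
    a = np.log(weights) + comp_log_prob
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def sample_mixture(rng, weights, means, log_stds):
    """Ancestral sampling: pick a component, then draw from its Gaussian."""
    k = rng.choice(len(weights), p=weights)
    return means[k] + np.exp(log_stds[k]) * rng.standard_normal(means.shape[1])

# Toy bimodal posterior approximation: two 1-D components at -2 and +2.
rng = np.random.default_rng(0)
weights = np.array([0.5, 0.5])
means = np.array([[-2.0], [2.0]])
log_stds = np.zeros((2, 1))
z = sample_mixture(rng, weights, means, log_stds)
lp = mixture_log_prob(z, weights, means, log_stds)
```

In a VAE, `weights`, `means`, and `log_stds` would be outputs of the encoder network, and `mixture_log_prob` would enter the ELBO's entropy/KL term; a single-Gaussian family is the special case K = 1.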