MaGNET: Uniform Sampling from Deep Generative Network Manifolds Without Retraining. (arXiv:2110.08009v3 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2110.08009
Jan. 24, 2022, 2:11 a.m. | Ahmed Imtiaz Humayun, Randall Balestriero, Richard Baraniuk
cs.LG updates on arXiv.org
Deep Generative Networks (DGNs) are extensively employed in Generative
Adversarial Networks (GANs), Variational Autoencoders (VAEs), and their
variants to approximate the data manifold and distribution. However, training
samples are often distributed non-uniformly on the manifold, due to the cost or
convenience of collection. For example, the CelebA dataset contains a large
fraction of smiling faces. These biases are reproduced when sampling from the
trained DGN, which is not always desirable, e.g., for fairness or data
augmentation. In …
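The effect the abstract describes can be illustrated with a one-dimensional toy: a generator fed uniform latents still produces a non-uniform output density, because the mapping stretches and compresses the latent space unevenly. The Jacobian-based reweighting below is a generic volume-element correction sketched for illustration only; it is an assumption for this demo, not the paper's actual MaGNET procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generator": maps a uniform latent z in [-1, 1] to x = z**3.
# Its outputs pile up near 0 even though z is uniform, mimicking how
# a trained DGN reproduces a non-uniform training distribution.
def g(z):
    return z ** 3

def jac(z):
    return 3 * z ** 2  # |dg/dz|, the 1-D volume element

n = 200_000
z = rng.uniform(-1, 1, n)

# Naive sampling: push uniform latents through the generator.
x_naive = g(z)

# Volume-element-corrected sampling (rejection): keep z with probability
# proportional to |g'(z)|, so accepted outputs are uniform on g's range.
accept = rng.uniform(0, jac(1.0), n) < jac(z)
x_uniform = g(z[accept])

# Mass near 0: heavily over-represented for naive sampling (~46%),
# close to the uniform value of 10% after reweighting.
frac_naive = np.mean(np.abs(x_naive) < 0.1)
frac_uniform = np.mean(np.abs(x_uniform) < 0.1)
print(frac_naive, frac_uniform)
```

For a deep generative network, the analogous correction would involve the Jacobian of the network at each latent point rather than a closed-form derivative; the 1-D case just makes the density distortion and its correction easy to verify.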