AdvNF: Reducing Mode Collapse in Conditional Normalising Flows using Adversarial Learning
April 12, 2024, 4:43 a.m. | Vikas Kanaujia, Mathias S. Scheurer, Vipul Arora
cs.LG updates on arXiv.org arxiv.org
Abstract: Deep generative models complement Markov chain Monte Carlo methods for efficiently sampling from high-dimensional distributions. Among these methods, explicit generators, such as Normalising Flows (NFs), combined with the Metropolis-Hastings algorithm have been extensively applied to obtain unbiased samples from target distributions. We systematically study central problems in conditional NFs, such as high variance, mode collapse and data efficiency. We propose adversarial training for NFs to ameliorate these problems. Experiments are conducted with low-dimensional synthetic datasets and …
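The NF-plus-Metropolis-Hastings combination the abstract refers to is typically an independence sampler: the flow proposes samples, and an accept/reject step corrects for the mismatch between the flow and the target, yielding asymptotically unbiased samples. The sketch below illustrates that acceptance rule with a simple Gaussian proposal standing in for a trained flow (the target, proposal and all function names here are illustrative assumptions, not from the paper):

```python
import math
import random

def log_target(x):
    # Unnormalised log-density of a bimodal target (mixture of two unit Gaussians).
    return math.log(0.5 * math.exp(-0.5 * (x - 2.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x + 2.0) ** 2))

def propose():
    # Stand-in for sampling from a trained normalising flow:
    # a broad Gaussian wide enough to cover both target modes.
    return random.gauss(0.0, 3.0)

def log_proposal(x):
    # Exact log-density of the proposal, playing the role of the
    # flow's tractable log-likelihood.
    return -0.5 * (x / 3.0) ** 2 - math.log(3.0 * math.sqrt(2.0 * math.pi))

def independence_mh(n_steps, seed=0):
    """Metropolis-Hastings independence sampler with the proposal above."""
    random.seed(seed)
    x = propose()
    samples = []
    for _ in range(n_steps):
        y = propose()
        # Acceptance ratio for an independence sampler:
        # alpha = [p(y)/q(y)] / [p(x)/q(x)], computed in log space.
        log_alpha = (log_target(y) - log_proposal(y)) \
                  - (log_target(x) - log_proposal(x))
        if math.log(random.random()) < log_alpha:
            x = y
        samples.append(x)
    return samples

samples = independence_mh(20000)
mean = sum(samples) / len(samples)  # symmetric target, so the mean is near 0
```

If the flow exhibits mode collapse and concentrates on one mode, the ratio p/q explodes for samples from the neglected mode, driving the acceptance rate down; this is one reason the paper targets mode collapse in conditional NFs.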