Web: http://arxiv.org/abs/2201.10989

Jan. 27, 2022, 2:11 a.m. | Pierre-Alexandre Mattei, Jes Frellsen

cs.LG updates on arXiv.org

We revisit the theory of importance weighted variational inference (IWVI), a promising strategy for learning latent variable models. IWVI uses new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually obtained via importance sampling. Burda, Grosse and Salakhutdinov (2016) showed that increasing the number of importance samples provably narrows the gap between the bound and the likelihood. Inspired by this simple monotonicity theorem, we present a series of nonasymptotic …
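To make the bound concrete, here is a minimal NumPy sketch of a Monte Carlo objective, the importance weighted (IWAE) bound of Burda, Grosse and Salakhutdinov (2016). The toy latent variable model, the Gaussian proposal, and the names `log_joint` and `iwae_bound` are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the IWAE bound L_K = E_q[log (1/K) sum_k p(x, z_k) / q(z_k | x)],
# estimated by averaging over many outer draws. Toy model (assumed here):
# z ~ N(0, 1), x | z ~ N(z, 1), with proposal q(z | x) = N(mu, sigma^2).
import numpy as np
from scipy.special import logsumexp


def log_joint(x, z):
    """log p(x, z) for the toy model: z ~ N(0, 1), x | z ~ N(z, 1)."""
    log_prior = -0.5 * (z ** 2 + np.log(2 * np.pi))
    log_lik = -0.5 * ((x - z) ** 2 + np.log(2 * np.pi))
    return log_prior + log_lik


def iwae_bound(x, mu, sigma, K, n_outer=10_000, rng=None):
    """Monte Carlo estimate of L_K with K importance samples per outer draw."""
    rng = np.random.default_rng(rng)
    z = rng.normal(mu, sigma, size=(n_outer, K))  # z_k ~ q(z | x)
    log_q = -0.5 * (((z - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))
    log_w = log_joint(x, z) - log_q               # log importance weights
    # Stable log of the average weight for each outer draw, then average.
    return np.mean(logsumexp(log_w, axis=1) - np.log(K))


x = 1.5
for K in (1, 5, 50, 500):
    print(K, iwae_bound(x, mu=0.5, sigma=1.0, K=K, rng=0))
# The estimates increase with K, illustrating the monotonicity theorem:
# L_K approaches log p(x) from below (here x ~ N(0, 2) in closed form).
```

In this toy setting the marginal likelihood is available exactly, so the printed values can be checked against log p(x); with a learned proposal the same monotone tightening in K is what the paper's nonasymptotic analysis quantifies.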

arxiv ml
