Web: http://arxiv.org/abs/2111.10510

Jan. 27, 2022, 2:11 a.m. | Francisco Vargas, Andrius Ovsianas, David Fernandes, Mark Girolami, Neil D. Lawrence, Nikolas Nüsken

cs.LG updates on arXiv.org

In this work we explore a new framework for approximate Bayesian inference on
large datasets based on stochastic control. We advocate stochastic control as a
finite-time, low-variance alternative to popular steady-state methods such
as stochastic gradient Langevin dynamics (SGLD). Furthermore, we discuss and
adapt the existing theoretical guarantees of this framework and establish
connections to existing variational inference (VI) routines in SDE-based models.
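
To make the finite-time contrast with SGLD concrete, below is a minimal sketch (my own illustration, not the paper's implementation) of sampling by simulating a controlled SDE dX_t = u_t(X_t) dt + dW_t over the fixed horizon [0, 1], using the classical Schrödinger–Föllmer drift. The toy 1D Gaussian target and all names are assumptions; the drift is estimated here by plain Monte Carlo, whereas in practice a neural parametrisation trained with a stochastic-control objective would take its place.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target (an assumption for illustration): pi = N(MU, SIGMA^2).
MU, SIGMA = 2.0, 0.5

def log_ratio(y):
    """log f(y) with f = dpi/dgamma, gamma = N(0, 1); additive constants
    are dropped, since they cancel inside the drift."""
    return -0.5 * ((y - MU) / SIGMA) ** 2 + 0.5 * y ** 2

def drift(x, t, n_mc=256):
    """Monte Carlo estimate of the Schrodinger-Follmer drift
    u_t(x) = grad_x log E_Z[f(x + sqrt(1 - t) * Z)],  Z ~ N(0, 1)."""
    s = np.sqrt(max(1.0 - t, 1e-6))
    z = rng.standard_normal((x.size, n_mc))
    y = x[:, None] + s * z
    f = np.exp(log_ratio(y))
    # grad_y log f(y) = grad log pi(y) - grad log gamma(y)
    grad_log_f = -(y - MU) / SIGMA ** 2 + y
    # grad_x E[f] = E[f * grad log f], so u = E[f * grad log f] / E[f].
    return (f * grad_log_f).mean(axis=1) / (f.mean(axis=1) + 1e-12)

# Euler-Maruyama over the fixed, finite horizon [0, 1]: unlike SGLD,
# no convergence to a stationary distribution is required -- under the
# exact drift, X_1 is distributed as the target pi.
n_paths, n_steps = 2000, 100
dt = 1.0 / n_steps
x = np.zeros(n_paths)
for k in range(n_steps):
    u = drift(x, k * dt)
    x = x + u * dt + np.sqrt(dt) * rng.standard_normal(n_paths)

print(f"sample mean {x.mean():.3f} (target {MU}), "
      f"sample std {x.std():.3f} (target {SIGMA})")
```

Starting every path at X_0 = 0 and integrating to t = 1 illustrates the finite-time character of the approach: the horizon is fixed in advance, rather than relying on the long-run mixing that steady-state samplers such as SGLD require.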

arxiv bayesian learning ml neural
