Web: http://arxiv.org/abs/2206.11533

June 24, 2022, 1:10 a.m. | Fabio V. Difonzo, Vyacheslav Kungurtsev, Jakub Marecek

cs.LG updates on arXiv.org

Stochastic differential equations of Langevin-diffusion form have received
significant recent attention, thanks to their foundational role in both Bayesian
sampling algorithms and optimization in machine learning. In the latter, they
serve as a conceptual model of the stochastic gradient flow in training
over-parametrized models. However, the literature typically assumes smoothness
of the potential, whose gradient is the drift term. In many problems, though,
the potential function is not continuously differentiable, and hence the drift
is not Lipschitz-continuous everywhere. …
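To make the setting concrete, here is a minimal sketch (not the paper's method) of an Euler-Maruyama discretization of the overdamped Langevin SDE dX_t = -∇U(X_t) dt + √2 dW_t, using the non-smooth potential U(x) = |x| as an illustrative example: its drift, sign(x), is discontinuous at the origin, so it is not Lipschitz everywhere. The function names and step size are assumptions chosen for illustration.

```python
import numpy as np

def langevin_step(x, grad_U, step, rng):
    """One Euler-Maruyama step of dX_t = -grad U(X_t) dt + sqrt(2) dW_t."""
    return x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)

def grad_U(x):
    # Subgradient of the non-smooth potential U(x) = |x|.
    # The drift sign(x) jumps at 0, so it is not Lipschitz-continuous there.
    return np.sign(x)

rng = np.random.default_rng(0)
x = np.array([5.0])
trace = []
for _ in range(20000):
    x = langevin_step(x, grad_U, step=0.01, rng=rng)
    trace.append(x[0])
samples = np.array(trace[5000:])  # discard burn-in
# The target density is proportional to exp(-|x|), a Laplace distribution
# centered at 0, so the empirical mean should hover near zero.
print(samples.mean())
```

This is the unadjusted Langevin algorithm (ULA); away from the kink the subgradient coincides with the gradient, which is why the naive scheme still runs, even though the standard convergence analysis breaks down without global Lipschitz drift.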

