Web: http://arxiv.org/abs/2206.07553

June 16, 2022, 1:11 a.m. | Raghu Bollapragada, Tyler Chen, Rachel Ward

cs.LG updates on arXiv.org

Simple stochastic momentum methods are widely used in machine learning
optimization, but their strong practical performance is at odds with the
absence of theoretical guarantees of acceleration in the literature. In this
work, we aim to close the gap between theory and practice by showing that
stochastic heavy ball momentum, which can be interpreted as a randomized
Kaczmarz algorithm with momentum, retains the fast linear rate of
(deterministic) heavy ball momentum on quadratic optimization problems, at
least when minibatching with …
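The abstract's framing of stochastic heavy ball momentum as a randomized Kaczmarz algorithm with momentum can be illustrated with a short sketch. This is not the paper's exact minibatched variant: the momentum parameter `beta`, the step size, and the row-sampling distribution below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def kaczmarz_momentum(A, b, beta=0.3, iters=5000, seed=0):
    """Randomized Kaczmarz with heavy ball momentum for a consistent
    linear system A x = b (an illustrative sketch, not the paper's
    exact minibatched method).

    Update: x_{k+1} = x_k - (a_i^T x_k - b_i) / ||a_i||^2 * a_i
                      + beta * (x_k - x_{k-1}),
    with row i sampled proportionally to ||a_i||^2.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.sum(A**2, axis=1)
    probs = row_norms_sq / row_norms_sq.sum()  # norm-squared sampling
    x = np.zeros(n)
    x_prev = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Kaczmarz projection step onto the hyperplane a_i^T x = b_i
        residual_i = A[i] @ x - b[i]
        step = (residual_i / row_norms_sq[i]) * A[i]
        # heavy ball momentum term (beta is an assumed, untuned value)
        x_new = x - step + beta * (x - x_prev)
        x_prev, x = x, x_new
    return x

# Usage: solve a small consistent least-squares system.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = kaczmarz_momentum(A, b)
print(np.linalg.norm(A @ x_hat - b))  # residual shrinks toward zero
```

On consistent quadratic problems like this one, the iterates converge linearly; the paper's contribution concerns when the momentum term provably accelerates that rate in the stochastic (minibatched) setting.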

