On Uniform Boundedness Properties of SGD and its Momentum Variants. (arXiv:2201.10245v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2201.10245
June 23, 2022, 1:11 a.m. | Xiaoyu Wang, Mikael Johansson
cs.LG updates on arXiv.org
A theoretical, and potentially also practical, problem with stochastic gradient descent is that trajectories may escape to infinity. In this note, we investigate uniform boundedness properties of iterates and function values along the trajectories of the stochastic gradient descent algorithm and its important momentum variant. Under smoothness and $R$-dissipativity of the loss function, we show that broad families of step-sizes, including the widely used step-decay and cosine with (or without) restart step-sizes, result in uniformly bounded iterates and function values. …
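For readers unfamiliar with the objects the abstract refers to, the following is a minimal, illustrative sketch (not code from the paper): heavy-ball momentum SGD on a toy smooth, dissipative quadratic loss, run with the step-decay and cosine step-size schedules the abstract mentions. All function names, parameters, and the toy loss are assumptions chosen for illustration.

```python
# Illustrative sketch only: momentum SGD with step-decay / cosine step-sizes.
# The loss, noise model, and all hyperparameters are assumptions, not the paper's setup.
import numpy as np

def step_decay(eta0, t, drop_every=100, factor=0.5):
    """Step-decay schedule: multiply the base step-size by `factor` every `drop_every` iterations."""
    return eta0 * factor ** (t // drop_every)

def cosine(eta0, t, T=1000):
    """Cosine schedule (without restart): decays from eta0 toward 0 over T iterations."""
    return 0.5 * eta0 * (1 + np.cos(np.pi * min(t, T) / T))

def momentum_sgd(grad, x0, eta0, beta=0.9, schedule=cosine, T=1000, rng=None):
    """Heavy-ball momentum SGD: v_{t+1} = beta*v_t - eta_t*g_t, x_{t+1} = x_t + v_{t+1}."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for t in range(T):
        g = grad(x) + rng.normal(scale=0.1, size=x.shape)  # noisy gradient oracle
        v = beta * v - schedule(eta0, t) * g
        x = x + v
    return x

# Toy smooth, dissipative loss f(x) = 0.5 * ||x||^2, so grad f(x) = x.
x_final = momentum_sgd(grad=lambda x: x, x0=[5.0, -3.0], eta0=0.1)
print(x_final)  # iterates remain bounded and approach the minimizer at 0
```

In this toy setting the iterates stay bounded for both schedules, which is the kind of behavior the paper establishes rigorously under smoothness and $R$-dissipativity.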