Neural Network Weights Do Not Converge to Stationary Points: An Invariant Measure Perspective. (arXiv:2110.06256v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2110.06256
June 20, 2022, 1:11 a.m. | Jingzhao Zhang, Haochuan Li, Suvrit Sra, Ali Jadbabaie
cs.LG updates on arXiv.org
This work examines the deep disconnect between existing theoretical analyses
of gradient-based algorithms and the practice of training deep neural networks.
Specifically, we provide numerical evidence that in large-scale neural network
training (e.g., ImageNet + ResNet101, and WT103 + TransformerXL models), the
neural network's weights do not converge to stationary points where the
gradient of the loss is zero. Remarkably, however, we observe that even though
the weights do not converge to stationary points, the progress in minimizing
the loss …
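The abstract's central observation can be checked empirically by logging the full-batch gradient norm alongside the training loss: convergence to a stationary point would require the gradient norm to approach zero. Below is a minimal, illustrative sketch (not the paper's actual experiments, which used ImageNet/ResNet101 and WT103/TransformerXL) using SGD on a synthetic least-squares problem; the model, data, and hyperparameters are all assumptions for illustration.

```python
import numpy as np

# Illustrative sketch: track whether SGD iterates approach a stationary
# point by logging the full-batch gradient norm next to the training loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=1000)

w = np.zeros(20)          # parameter vector (stand-in for network weights)
lr, batch = 0.01, 32
for step in range(2000):
    idx = rng.integers(0, len(X), size=batch)
    grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch   # stochastic mini-batch gradient
    w -= lr * grad
    if step % 500 == 0:
        full_grad = X.T @ (X @ w - y) / len(X)        # full-batch gradient
        loss = 0.5 * np.mean((X @ w - y) ** 2)
        print(f"step {step}: loss {loss:.4f}  ||grad|| {np.linalg.norm(full_grad):.4f}")
```

With a fixed step size and gradient noise, the loss can keep decreasing while the iterates hover in a region where the gradient norm stays bounded away from zero, which is the qualitative pattern the paper reports at much larger scale.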