On the Stability of Gradient Descent for Large Learning Rate
Feb. 21, 2024, 5:42 a.m. | Alexandru Crăciun, Debarghya Ghoshdastidar
cs.LG updates on arXiv.org
Abstract: There is currently significant interest in understanding the Edge of Stability (EoS) phenomenon observed in neural network training, characterized by a non-monotonic decrease of the loss over epochs, while the sharpness of the loss (the spectral norm of the Hessian) progressively approaches and stabilizes around 2/(learning rate). Reasons for the existence of EoS when training with gradient descent have recently been proposed -- a lack of flat minima near the gradient …
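
The 2/(learning rate) threshold in the abstract comes from the classical stability analysis of gradient descent: along a Hessian eigendirection with eigenvalue λ, the update x ← x − η∇L contracts only if |1 − ηλ| < 1, i.e. λ < 2/η. The sketch below is my own illustration, not code from the paper: it runs full-batch gradient descent on a toy two-layer tanh regression model in JAX and prints the sharpness, estimated by power iteration on Hessian-vector products, alongside the 2/η threshold. The architecture, data, and hyperparameters are all assumptions chosen for illustration.

import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

# Toy regression problem (illustrative assumption, not from the paper).
key = jax.random.PRNGKey(0)
kx, kt, kp = jax.random.split(key, 3)
X = jax.random.normal(kx, (64, 4))
y = jnp.sin(X @ jax.random.normal(kt, (4,)))   # nonlinear targets

def init_params(key, width=32):
    k1, k2 = jax.random.split(key)
    return (jax.random.normal(k1, (4, width)) / 2.0,
            jax.random.normal(k2, (width,)) / jnp.sqrt(width))

def loss_fn(params):
    W1, w2 = params
    return jnp.mean((jnp.tanh(X @ W1) @ w2 - y) ** 2)

def sharpness(params, iters=100):
    # Largest Hessian eigenvalue via power iteration on
    # Hessian-vector products (forward-over-reverse).
    flat, unravel = ravel_pytree(params)
    f = lambda p: loss_fn(unravel(p))
    hvp = lambda v: jax.jvp(jax.grad(f), (flat,), (v,))[1]
    v = jnp.ones_like(flat) / jnp.sqrt(flat.size)
    for _ in range(iters):
        hv = hvp(v)
        v = hv / (jnp.linalg.norm(hv) + 1e-12)
    return jnp.vdot(v, hvp(v))  # Rayleigh quotient estimate

eta = 0.05                      # deliberately large learning rate
params = init_params(kp)
grad_fn = jax.jit(jax.grad(loss_fn))
for step in range(2001):
    grads = grad_fn(params)
    params = jax.tree_util.tree_map(lambda p, g: p - eta * g, params, grads)
    if step % 400 == 0:
        print(f"step {step:4d}  loss {loss_fn(params):.4f}  "
              f"sharpness {sharpness(params):.2f}  2/eta {2 / eta:.2f}")

In such toy setups the printed sharpness typically rises during training and then hovers near 2/η rather than crossing far above it, which is the stabilization behavior the abstract describes.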