High-dimensional limit theorems for SGD: Effective dynamics and critical scaling. (arXiv:2206.04030v2 [stat.ML] UPDATED)
Nov. 24, 2022, 7:14 a.m. | Gerard Ben Arous, Reza Gheissari, Aukosh Jagannath
stat.ML updates on arXiv.org
We study the scaling limits of stochastic gradient descent (SGD) with
constant step-size in the high-dimensional regime. We prove limit theorems for
the trajectories of summary statistics (i.e., finite-dimensional functions) of
SGD as the dimension goes to infinity. Our approach allows one to choose the
summary statistics that are tracked, the initialization, and the step-size. It
yields both ballistic (ODE) and diffusive (SDE) limits, with the limit
depending dramatically on the former choices. We show a critical scaling regime
for …
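The abstract's ballistic (ODE) limit can be illustrated in a toy setting. The sketch below is an assumption of mine, not the paper's exact model: online SGD for noiseless least squares with isotropic Gaussian data, constant step size delta = c/d, tracking the one-dimensional summary statistic m = <x, v> (alignment with the ground-truth direction v). As d grows, the trajectory of m concentrates on the ODE dm/dt = c(1 - m).

```python
import numpy as np

# Illustrative sketch (assumed model, not the paper's setting): online SGD
# on the per-sample loss (y - <x, a>)^2 / 2 with y = <v, a>, a ~ N(0, I_d),
# and constant step size delta = c / d.  The tracked summary statistic is
# m = <x, v>; its limiting ODE is dm/dt = c (1 - m).

rng = np.random.default_rng(0)

d = 2000            # ambient dimension
c = 0.5             # step-size constant; delta = c / d
delta = c / d
n_steps = 4 * d     # run for t = 4 units of rescaled time (d steps per unit)

v = np.zeros(d)
v[0] = 1.0                            # unit ground-truth direction
x = rng.normal(size=d) / np.sqrt(d)   # small random initialization

alignments = []
for _ in range(n_steps):
    a = rng.normal(size=d)            # fresh sample each step (online SGD)
    y = v @ a                         # noiseless label
    x += delta * (y - x @ a) * a      # one constant-step SGD update
    alignments.append(x @ v)          # record the summary statistic m

m_final = alignments[-1]
# ODE prediction at t = 4 from m(0) ~ 0:  m(t) = 1 - exp(-c t) ~ 0.865
print(f"final alignment m = {m_final:.3f}")
```

With this scaling the simulated trajectory of m stays within O(1/sqrt(d)) fluctuations of the ODE solution; other choices of step size or initialization would, per the abstract, change the limit (e.g., produce diffusive SDE behavior instead).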