June 9, 2022, 1:11 a.m. | Gerard Ben Arous, Reza Gheissari, Aukosh Jagannath

stat.ML updates on arXiv.org

We study the scaling limits of stochastic gradient descent (SGD) with
constant step-size in the high-dimensional regime. We prove limit theorems for
the trajectories of summary statistics (i.e., finite-dimensional functions) of
SGD as the dimension goes to infinity. Our approach allows one to choose the
summary statistics that are tracked, the initialization, and the step-size. It
yields both ballistic (ODE) and diffusive (SDE) limits, with the limit
depending dramatically on the former choices. Interestingly, we find a critical
scaling regime …

arxiv dynamics ml scaling
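
The abstract is truncated above, but the setup it describes (online SGD with a dimension-dependent step-size whose low-dimensional summary statistics converge to an ODE in the ballistic regime) can be illustrated with a toy simulation. The sketch below is not the paper's model or analysis: it assumes a simple high-dimensional streaming linear regression with Gaussian data and a step-size of c/d, tracks the squared-error summary statistic on the rescaled clock t = k/d, and compares it with the ODE one obtains by heuristically averaging the one-step update. All specific choices (d, c, sigma, the loss) are illustrative assumptions.

```python
# Illustrative sketch only -- NOT the paper's model or theorem.
# Online SGD for high-dimensional linear regression with step-size eta = c/d.
# We track the summary statistic r_k = ||theta_k - theta*||^2 and compare it,
# on the rescaled time t = k/d, with the heuristic ODE limit
#   dr/dt = -(2c - c^2) r + c^2 sigma^2,
# obtained by averaging the one-step update over the Gaussian data.
import numpy as np

rng = np.random.default_rng(0)

d = 2000          # ambient dimension (assumption)
c = 0.5           # step-size constant: eta = c / d (assumption)
sigma = 0.5       # label-noise standard deviation (assumption)
T = 5.0           # horizon in rescaled time t = k / d
n_steps = int(T * d)

theta_star = rng.standard_normal(d) / np.sqrt(d)   # ground-truth parameter
theta = np.zeros(d)                                # SGD initialization

eta = c / d
times, r_empirical = [], []
for k in range(n_steps):
    if k % (d // 10) == 0:
        times.append(k / d)
        r_empirical.append(np.sum((theta - theta_star) ** 2))
    # stream one Gaussian sample and take an SGD step on the squared loss
    x = rng.standard_normal(d)
    y = x @ theta_star + sigma * rng.standard_normal()
    theta += eta * (y - x @ theta) * x

# Closed-form solution of the heuristic ODE, started from r(0) = ||theta*||^2.
times = np.array(times)
r0 = np.sum(theta_star ** 2)
r_inf = c * sigma**2 / (2.0 - c)
r_ode = (r0 - r_inf) * np.exp(-(2.0 * c - c**2) * times) + r_inf

for t, re, ro in zip(times, r_empirical, r_ode):
    print(f"t = {t:5.2f}   empirical r = {re:.4f}   ODE limit = {ro:.4f}")
```

On the rescaled clock t = k/d the empirical curve should stay close to the ODE for large d, which is the ballistic behaviour the abstract refers to; probing the diffusive (SDE) or critical regimes mentioned there would require different step-size and initialization scalings and is beyond this sketch.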
