Conservative SPDEs as fluctuating mean field limits of stochastic gradient descent. (arXiv:2207.05705v1 [math.PR])
July 13, 2022, 1:11 a.m. | Benjamin Gess, Rishabh S. Gvalani, Vitalii Konarovskyi
stat.ML updates on arXiv.org
The convergence of stochastic interacting particle systems in the mean-field limit to solutions to conservative stochastic partial differential equations is shown, with optimal rate of convergence. As a second main result, a quantitative central limit theorem for such SPDEs is derived, again with optimal rate of convergence.

The results apply in particular to the convergence in the mean-field scaling of stochastic gradient descent dynamics in overparametrized, shallow neural networks to solutions to SPDEs. It is shown that the inclusion of …
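The mean-field scaling the abstract refers to can be illustrated with a small simulation. The sketch below is an assumption-laden toy, not the paper's construction: it runs single-sample SGD on an overparametrized shallow ReLU network f(x) = (1/N) Σ_i a_i relu(w_i x), where the 1/N normalization is the mean-field scaling and the weight pairs (a_i, w_i) play the role of the interacting particles whose empirical measure converges as N grows.

```python
import numpy as np

# Hedged toy example (not the paper's construction): mean-field-scaled SGD
# on a shallow ReLU network for a 1-d regression problem.
rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def forward(a, w, xs):
    # Mean-field normalization: average the N neuron outputs.
    return (a * relu(np.outer(xs, w))).mean(axis=1)

N = 200                  # number of neurons (overparametrization)
a = rng.normal(size=N)   # output weights  ("particle" coordinate 1)
w = rng.normal(size=N)   # input weights   ("particle" coordinate 2)
lr = 0.01                # O(1) per-particle learning rate

# Toy regression target, purely illustrative.
x_data = rng.uniform(-1.0, 1.0, size=64)
y_data = np.sin(np.pi * x_data)

initial_mse = np.mean((forward(a, w, x_data) - y_data) ** 2)

for step in range(2000):
    i = rng.integers(len(x_data))          # single-sample SGD
    x, y = x_data[i], y_data[i]
    err = (a * relu(w * x)).mean() - y
    # Per-particle gradients of the squared loss; the network's 1/N factor
    # is absorbed into the particle time scale, so each particle (a_i, w_i)
    # moves at an O(1) rate -- the regime in which the mean-field limit
    # of the particle system is taken.
    a -= lr * err * relu(w * x)
    w -= lr * err * a * (w * x > 0.0) * x

final_mse = np.mean((forward(a, w, x_data) - y_data) ** 2)
```

The paper's results concern the fluctuating (stochastic) corrections to this deterministic mean-field picture, i.e. the finite-N noise that survives in the limiting conservative SPDE; the toy above only exhibits the scaling, not the limit object.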