Statistical inference with implicit SGD: proximal Robbins-Monro vs. Polyak-Ruppert. (arXiv:2206.12663v2 [stat.ML] UPDATED)
June 29, 2022, 1:11 a.m. | Yoonhyung Lee, Sungdong Lee, Joong-Ho Won
cs.LG updates on arXiv.org arxiv.org
The implicit stochastic gradient descent (ISGD) method, a proximal version of SGD,
is gaining interest in the literature because of its stability advantage over
(explicit) SGD. In this paper, we conduct an in-depth analysis of the two modes of
ISGD for smooth convex functions, namely the proximal Robbins-Monro (proxRM) and
proximal Polyak-Ruppert (proxPR) procedures, for their use in statistical inference
on model parameters. Specifically, we derive non-asymptotic point-estimation error
bounds for both proxRM and proxPR iterates and their limiting distributions, and
propose …
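As a rough illustration of the two modes described above, the sketch below runs implicit SGD on a linear-regression loss, where the implicit update has a closed form, and tracks both the last iterate (in the spirit of proxRM) and the running Polyak-Ruppert average of the iterates (in the spirit of proxPR). The step-size schedule, problem sizes, and data are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Synthetic linear-regression data (assumed setup, not from the paper).
rng = np.random.default_rng(0)
d, n = 3, 5000
theta_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ theta_star + rng.normal(scale=0.1, size=n)

theta = np.zeros(d)      # last iterate (proxRM-style)
theta_bar = np.zeros(d)  # running Polyak-Ruppert average (proxPR-style)
for i in range(n):
    x, yi = X[i], y[i]
    gamma = 1.0 / (1 + i) ** 0.6  # Robbins-Monro-type step size (assumed schedule)
    # Implicit update theta' = theta + gamma * (yi - x @ theta') * x
    # has the closed form below for squared-error loss:
    theta = theta + gamma * (yi - x @ theta) / (1.0 + gamma * (x @ x)) * x
    # Polyak-Ruppert averaging of the iterates:
    theta_bar += (theta - theta_bar) / (i + 1)
```

Note the implicit update evaluates the gradient at the *new* iterate, which is what damps the step (the `1 + gamma * (x @ x)` denominator) and gives ISGD its stability relative to explicit SGD.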