Statistical inference with implicit SGD: proximal Robbins-Monro vs. Polyak-Ruppert. (arXiv:2206.12663v2 [stat.ML] UPDATED)
June 29, 2022, 1:11 a.m. | Yoonhyung Lee, Sungdong Lee, Joong-Ho Won
stat.ML updates on arXiv.org arxiv.org
The implicit stochastic gradient descent (ISGD), a proximal version of SGD,
is gaining interest in the literature due to its stability over (explicit) SGD.
In this paper, we conduct an in-depth analysis of the two modes of ISGD for
smooth convex functions, namely proximal Robbins-Monro (proxRM) and proximal
Polyak-Ruppert (proxPR) procedures, for their use in statistical inference on
model parameters. Specifically, we derive non-asymptotic point estimation error
bounds of both proxRM and proxPR iterates and their limiting distributions, and
propose …
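To make the abstract concrete: for a squared-error loss, the implicit SGD update theta_n = theta_{n-1} + gamma_n * x_n * (y_n - x_n' theta_n) (gradient evaluated at the *new* iterate) has a closed form, and the Polyak-Ruppert mode simply averages the resulting iterates. The sketch below is an illustration under assumed settings (synthetic linear-regression data, step size gamma_n = gamma_0 * n^(-alpha)), not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 20000
theta_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ theta_star + 0.1 * rng.normal(size=n)

theta = np.zeros(d)       # proxRM: last implicit-SGD iterate
theta_bar = np.zeros(d)   # proxPR: Polyak-Ruppert running average
gamma0, alpha = 1.0, 0.6  # assumed Robbins-Monro schedule gamma_n = gamma0 * n^(-alpha)

for i in range(n):
    g = gamma0 * (i + 1) ** (-alpha)
    x, yi = X[i], y[i]
    # Implicit update theta_n = theta_{n-1} + g * x * (yi - x @ theta_n);
    # for squared loss it solves in closed form:
    theta = theta + (g / (1.0 + g * (x @ x))) * x * (yi - x @ theta)
    theta_bar += (theta - theta_bar) / (i + 1)  # running average of iterates

print(np.linalg.norm(theta - theta_star))      # proxRM error
print(np.linalg.norm(theta_bar - theta_star))  # proxPR error
```

The implicit (proximal) step shrinks the effective learning rate by 1/(1 + g * ||x||^2), which is the source of ISGD's stability relative to explicit SGD at large initial step sizes.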
Jobs in AI, ML, Big Data
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
IT Commercial Data Analyst - ESO
@ National Grid | Warwick, GB, CV34 6DA
Data Analyst Intern – Private Banking - July 2024
@ Rothschild & Co | Paris (Messine-29)
Operations Research Scientist I - Network Optimization Focus
@ CSX | Jacksonville, FL, United States
Machine Learning Operations Engineer
@ Intellectsoft | Baku, Baku, Azerbaijan - Remote
Data Analyst
@ Health Care Service Corporation | Richardson Texas HQ (1001 E. Lookout Drive)