Efficient implementation of incremental proximal-point methods. (arXiv:2205.01457v1 [cs.LG])
Web: http://arxiv.org/abs/2205.01457
May 4, 2022, 1:11 a.m. | Alex Shtoff
cs.LG updates on arXiv.org arxiv.org
Model training algorithms which observe a small portion of the training set
in each computational step are ubiquitous in practical machine learning, and
include both stochastic and online optimization methods. In the vast majority
of cases, such algorithms observe the training samples via the
gradients of the cost functions the samples incur. Thus, the information these
methods exploit is the \emph{slope} of the cost functions, via their
first-order approximations.
To address limitations of gradient-based methods, such as sensitivity to
step-size choice …
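To make the contrast concrete, here is a minimal sketch of a plain stochastic gradient step versus an incremental proximal-point step, using the squared loss on a single sample (for which the proximal subproblem has a simple closed form). This is an illustrative assumption, not the paper's implementation; the function names and the toy data are made up for the example.

```python
import numpy as np

def sgd_step(w, x, y, eta):
    # Plain stochastic gradient step on the squared loss
    # f(w) = 0.5 * (x @ w - y)**2; uses only first-order (slope)
    # information at the current iterate.
    return w - eta * (x @ w - y) * x

def proximal_point_step(w, x, y, eta):
    # Incremental proximal-point step: exactly minimizes
    #     f(v) + ||v - w||^2 / (2 * eta).
    # For the squared loss the minimizer has a closed form:
    # the residual is shrunk by a factor 1 / (1 + eta * ||x||^2),
    # which keeps the step stable even for large eta.
    residual = (x @ w - y) / (1.0 + eta * (x @ x))
    return w - eta * residual * x

# Toy data: one sample, deliberately large step size.
w = np.zeros(2)
x = np.array([3.0, 4.0])
y = 1.0
eta = 1.0

w_sgd = sgd_step(w, x, y, eta)            # overshoots badly
w_prox = proximal_point_step(w, x, y, eta)  # stays close to the minimizer
```

With these numbers the SGD iterate lands at `[3, 4]` (loss blows up), while the proximal-point iterate lands at `x / 26`, nearly solving the sample's loss, which illustrates the step-size robustness the abstract alludes to.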