Web: http://arxiv.org/abs/2205.02273

May 6, 2022, 1:11 a.m. | Binghui Xie, Chenhan Jin, Kaiwen Zhou, James Cheng, Wei Meng

cs.LG updates on arXiv.org

Stochastic variance reduced methods have shown strong performance in solving
finite-sum problems. However, these methods usually require users to manually
tune the step-size, which is time-consuming or even infeasible for some
large-scale optimization tasks. To overcome this problem, we propose and
analyze several novel adaptive variants of the popular SAGA algorithm. We
further design a variant of the Barzilai-Borwein step-size tailored to the
incremental gradient method to ensure memory efficiency and fast convergence.
We establish its convergence guarantees …

