Making SGD Parameter-Free

Web: http://arxiv.org/abs/2205.02160

May 5, 2022, 1:10 a.m. | Yair Carmon, Oliver Hinder

stat.ML updates on arXiv.org

We develop an algorithm for parameter-free stochastic convex optimization
(SCO) whose rate of convergence is only a double-logarithmic factor larger than
the optimal rate for the corresponding known-parameter setting. In contrast,
the best previously known rates for parameter-free SCO are based on online
parameter-free regret bounds, which contain unavoidable excess logarithmic
terms compared to their known-parameter counterparts. Our algorithm is
conceptually simple, has high-probability guarantees, and is also partially
adaptive to unknown gradient norms, smoothness, and strong convexity. At the …
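The abstract does not spell out the method itself. As a rough illustration of what "parameter-free" means here, the sketch below wraps plain SGD in a geometric grid search over candidate step sizes and keeps the candidate averaged iterate with the smallest estimated loss. Everything in it (the `parameter_free_sgd` wrapper, the selection rule, the toy quadratic) is an illustrative assumption, not the Carmon-Hinder algorithm.

```python
import numpy as np

def sgd(grad, x0, eta, T, rng):
    """Plain SGD with a fixed step size eta; returns the average iterate."""
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for _ in range(T):
        x = x - eta * grad(x, rng)
        if np.linalg.norm(x) > 1e100:        # step size too large: diverged
            return np.full_like(x, np.inf)   # candidate will be rejected
        avg += x
    return avg / T

def parameter_free_sgd(grad, loss, x0, T, rng, k_max=10):
    """Hedged sketch: try step sizes eta = 2^k / sqrt(T) on a geometric grid
    and keep the candidate with the smallest estimated loss. This grid-search
    wrapper is for illustration only; it is not the paper's algorithm."""
    best_x, best_val = None, np.inf
    for k in range(-k_max, k_max + 1):
        eta = 2.0 ** k / np.sqrt(T)
        x_bar = sgd(grad, x0, eta, T, rng)
        val = loss(x_bar)  # estimate of f(x_bar), e.g. a held-out average
        if val < best_val:
            best_x, best_val = x_bar, val
    return best_x

# Toy problem: f(x) = 0.5 * ||x - x_star||^2 with noisy gradient oracle.
rng = np.random.default_rng(0)
x_star = np.array([3.0, -2.0])
grad = lambda x, rng: (x - x_star) + 0.1 * rng.standard_normal(x.shape)
loss = lambda x: 0.5 * np.sum((x - x_star) ** 2)
x_hat = parameter_free_sgd(grad, loss, x0=np.zeros(2), T=2000, rng=rng)
print(x_hat)  # close to x_star, with no step size supplied by the user
```

A naive search like this pays roughly a logarithmic factor for scanning the step-size range, in the spirit of the excess logarithmic terms the abstract attributes to prior parameter-free rates; the paper's contribution is bringing that overhead down to a double-logarithmic factor.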
