Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions. (arXiv:2006.08167v2 [math.OC] UPDATED)
Web: http://arxiv.org/abs/2006.08167
Jan. 28, 2022, 2:11 a.m. | Tesi Xiao, Krishnakumar Balasubramanian, Saeed Ghadimi
cs.LG updates on arXiv.org
We analyze stochastic conditional gradient methods for constrained optimization problems arising in over-parametrized machine learning. We show that one could leverage the interpolation-like conditions satisfied by such models to obtain improved oracle complexities. Specifically, when the objective function is convex, we show that the conditional gradient method requires $\mathcal{O}(\epsilon^{-2})$ calls to the stochastic gradient oracle to find an $\epsilon$-optimal solution. Furthermore, by including a gradient sliding step, we show that the number of calls reduces to $\mathcal{O}(\epsilon^{-1.5})$.
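To make the setting concrete, below is a minimal sketch (not the authors' code) of a stochastic conditional gradient (Frank-Wolfe) loop on an over-parametrized least-squares problem, where noiseless labels give the interpolation property the paper exploits. The quadratic objective, $\ell_2$-ball constraint, batch size, and $2/(t+2)$ step size are illustrative assumptions, not the schedules analyzed in the paper.

```python
# Minimal sketch of stochastic Frank-Wolfe for constrained least squares
# over an l2-ball of radius R. Hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Over-parametrized linear regression: more features than samples, so an
# interpolating solution (zero training loss) exists inside the ball.
n, d, R = 50, 200, 10.0
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
x_star *= (0.5 * R) / np.linalg.norm(x_star)   # keep the interpolant feasible
b = A @ x_star                                  # noiseless labels => interpolation

def stochastic_grad(x, batch_size=8):
    """Minibatch gradient of 0.5 * ||A x - b||^2 (averaged over the batch)."""
    idx = rng.choice(n, size=batch_size, replace=False)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch_size

def lmo_l2_ball(g, radius):
    """Linear minimization oracle over {s : ||s||_2 <= radius}: argmin_s <g, s>."""
    norm = np.linalg.norm(g)
    return -radius * g / norm if norm > 0 else np.zeros_like(g)

x = np.zeros(d)
for t in range(2000):
    g = stochastic_grad(x)              # one stochastic gradient oracle call
    s = lmo_l2_ball(g, R)               # one linear minimization oracle call
    gamma = 2.0 / (t + 2)               # standard Frank-Wolfe step size
    x = x + gamma * (s - x)

print("final training loss:", 0.5 * np.mean((A @ x - b) ** 2))
```

Under interpolation the training loss can be driven toward zero within the constraint set; the paper's contribution is the improved count of stochastic gradient oracle calls needed to reach an $\epsilon$-optimal point, with the gradient-sliding variant reducing it further to $\mathcal{O}(\epsilon^{-1.5})$.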