Accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient. (arXiv:2206.01209v2 [math.OC] UPDATED)
June 27, 2022, 1:11 a.m. | Zhaosong Lu, Sanyou Mei
stat.ML updates on arXiv.org arxiv.org
In this paper, we develop accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient (LLCG), a class that goes beyond the well-studied setting of convex optimization with globally Lipschitz continuous gradient. In particular, we first consider unconstrained convex optimization with LLCG and propose accelerated proximal gradient (APG) methods for solving it. The proposed APG methods are equipped with a verifiable termination criterion and enjoy an operation complexity of ${\cal O}(\varepsilon^{-1/2}\log \varepsilon^{-1})$ and ${\cal O}(\log \varepsilon^{-1})$ for finding an $\varepsilon$-residual solution of …
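The flavor of an APG method that copes with a merely locally Lipschitz gradient can be illustrated with a generic FISTA-style accelerated proximal gradient sketch that uses backtracking to adapt the step size, so no global Lipschitz constant is required. This is a minimal illustration under assumptions, not the paper's exact algorithm: the function names, the backtracking rule, and the simple prox-gradient-residual stopping test are all hypothetical stand-ins.

```python
import numpy as np

def apg_backtracking(f, grad_f, prox_g, x0, L0=1.0, eta=2.0,
                     tol=1e-6, max_iter=100000):
    """Minimize f(x) + g(x) by accelerated proximal gradient (FISTA-style).

    f is convex and smooth with (only locally) Lipschitz gradient;
    prox_g(v, s) is the proximal map of s*g. Backtracking grows a local
    curvature estimate L, so a global Lipschitz constant is never needed.
    Illustrative sketch only; not the paper's method.
    """
    x_prev = x = np.asarray(x0, dtype=float)
    t = 1.0
    L = L0
    for _ in range(max_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        # extrapolated (momentum) point
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        g = grad_f(y)
        while True:  # backtracking: grow L until the quadratic bound holds
            x_new = prox_g(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # prox-gradient residual as a verifiable stopping measure
        if L * np.linalg.norm(d) < tol:
            return x_new
        x_prev, x, t = x, x_new, t_next
    return x
```

As a usage example, the quartic $f(x)=\tfrac14\sum_i (x_i-a_i)^4$ (with $g=0$, so the prox is the identity) is convex with a gradient that is locally but not globally Lipschitz; the backtracking loop handles it without any global constant.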