Accelerated Algorithms for Monotone Inclusion and Constrained Nonconvex-Nonconcave Min-Max Optimization. (arXiv:2206.05248v2 [math.OC] UPDATED)
Aug. 11, 2022, 1:11 a.m. | Yang Cai, Argyris Oikonomou, Weiqiang Zheng
cs.LG updates on arXiv.org
We study monotone inclusions and monotone variational inequalities, as well
as their generalizations to non-monotone settings. We first show that the Extra
Anchored Gradient (EAG) algorithm, originally proposed by Yoon and Ryu [2021]
for unconstrained convex-concave min-max optimization, can be applied to solve
the more general problem of Lipschitz monotone inclusion. More specifically, we
prove that the EAG solves Lipschitz monotone inclusion problems with an
accelerated convergence rate of $O(\frac{1}{T})$, which is optimal among all
first-order methods [Diakonikolas, 2020, Yoon …
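The EAG iteration referenced above pairs an extragradient-style half step with an anchoring pull toward the initial point. Below is a minimal sketch of that two-step form, assuming the standard anchoring weight $\beta_k = 1/(k+2)$ from Yoon and Ryu [2021]; the step size and the bilinear test operator are illustrative choices, not taken from this paper.

```python
import numpy as np

def eag(F, z0, alpha=0.1, T=500):
    """Sketch of Extra Anchored Gradient for a Lipschitz monotone operator F.

    Each iteration takes a half step and a full step, both pulled toward
    the anchor z0 with weight beta_k = 1/(k+2). Returns the final iterate.
    """
    z = z0.copy()
    for k in range(T):
        beta = 1.0 / (k + 2)                      # anchoring weight
        # half step: extragradient extrapolation plus anchor pull
        z_half = z + beta * (z0 - z) - alpha * F(z)
        # full step: operator evaluated at the half point
        z = z + beta * (z0 - z) - alpha * F(z_half)
    return z

# Illustrative monotone operator: the bilinear min-max problem
# min_x max_y x*y has F(x, y) = (y, -x), with unique zero at the origin.
F = lambda z: np.array([z[1], -z[0]])

z0 = np.array([1.0, 1.0])
zT = eag(F, z0)
print(np.linalg.norm(F(zT)))  # operator norm shrinks toward 0
```

On this bilinear example, plain gradient descent-ascent cycles or diverges, while the anchoring term drives the operator norm toward zero, consistent with the accelerated $O(1/T)$ rate stated in the abstract.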