Tight Last-Iterate Convergence of the Extragradient and the Optimistic Gradient Descent-Ascent Algorithm for Constrained Monotone Variational Inequalities. (arXiv:2204.09228v2 [math.OC] UPDATED)
May 11, 2022, 1:12 a.m. | Yang Cai, Argyris Oikonomou, Weiqiang Zheng
cs.LG updates on arXiv.org
The monotone variational inequality is a central problem in mathematical
programming that unifies and generalizes many important settings such as smooth
convex optimization, two-player zero-sum games, convex-concave saddle point
problems, etc. The extragradient algorithm by Korpelevich [1976] and the
optimistic gradient descent-ascent algorithm by Popov [1980] are arguably the
two most classical and popular methods for solving monotone variational
inequalities. Despite their long history, the following major problem remains
open. What is the last-iterate convergence rate of the extragradient algorithm …
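For readers who want a concrete picture of the two methods named in the abstract, below is a minimal NumPy sketch of the extragradient and optimistic gradient descent-ascent updates on a toy box-constrained bilinear saddle-point problem. The operator F, step size, box constraint, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_box(z, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^d (a simple convex constraint set).
    return np.clip(z, lo, hi)

def extragradient(F, z0, eta=0.1, steps=2000, project=project_box):
    # Extragradient (Korpelevich, 1976): a look-ahead step followed by an update
    # that uses the operator evaluated at the look-ahead point.
    z = z0.copy()
    for _ in range(steps):
        z_half = project(z - eta * F(z))    # extrapolation (look-ahead) step
        z = project(z - eta * F(z_half))    # update with the look-ahead operator value
    return z

def optimistic_gda(F, z0, eta=0.1, steps=2000, project=project_box):
    # Optimistic gradient descent-ascent (Popov, 1980): reuses the most recent
    # operator evaluation, so only one call to F is needed per iteration.
    z = z0.copy()
    g_prev = F(z)
    for _ in range(steps):
        z_half = project(z - eta * g_prev)
        g_prev = F(z_half)
        z = project(z - eta * g_prev)
    return z

# Illustrative problem: box-constrained bilinear saddle point min_x max_y x^T A y,
# whose VI operator F(x, y) = (A y, -A^T x) is monotone and Lipschitz.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
def F(z):
    x, y = z[:2], z[2:]
    return np.concatenate([A @ y, -A.T @ x])

z0 = np.array([0.8, -0.5, 0.3, 0.9])
print("EG   last iterate:", extragradient(F, z0))
print("OGDA last iterate:", optimistic_gda(F, z0))
```

The paper's open question concerns how fast the last iterate z_k of these loops (rather than the average of the iterates) approaches a solution under constraints; the sketch only illustrates the update rules themselves.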