Tight Last-Iterate Convergence of the Extragradient and the Optimistic Gradient Descent-Ascent Algorithm for Constrained Monotone Variational Inequalities. (arXiv:2204.09228v2 [math.OC] UPDATED)
Web: http://arxiv.org/abs/2204.09228
May 11, 2022, 1:12 a.m. | Yang Cai, Argyris Oikonomou, Weiqiang Zheng
cs.LG updates on arXiv.org
The monotone variational inequality is a central problem in mathematical
programming that unifies and generalizes many important settings such as smooth
convex optimization, two-player zero-sum games, convex-concave saddle point
problems, etc. The extragradient algorithm by Korpelevich [1976] and the
optimistic gradient descent-ascent algorithm by Popov [1980] are arguably the
two most classical and popular methods for solving monotone variational
inequalities. Despite their long history, the following major problem remains
open. What is the last-iterate convergence rate of the extragradient algorithm …
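For readers unfamiliar with the two methods named in the abstract, below is a minimal sketch of both update rules on a toy constrained monotone VI: a bilinear zero-sum game min_x max_y x^T A y over a box, where the operator F(x, y) = (Ay, -A^T x) is monotone and the constraint is handled by Euclidean projection. The random matrix A, the step size eta, and the projected-residual progress measure are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))  # illustrative random payoff matrix

def F(z):
    """Monotone VI operator for the saddle problem min_x max_y x^T A y."""
    x, y = z[:n], z[n:]
    return np.concatenate([A @ y, -A.T @ x])

def project(z, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^{2n} (coordinate-wise clip)."""
    return np.clip(z, lo, hi)

def extragradient(z0, eta=0.05, iters=2000):
    """Korpelevich's extragradient: two operator evaluations per iteration."""
    z = z0
    for _ in range(iters):
        z_half = project(z - eta * F(z))   # extrapolation (midpoint) step
        z = project(z - eta * F(z_half))   # update step uses the midpoint
    return z

def optimistic_gda(z0, eta=0.05, iters=2000):
    """Popov's optimistic method: one fresh operator evaluation per
    iteration, reusing the previous one for the extrapolation step."""
    z = z0
    Fw_prev = F(z0)
    for _ in range(iters):
        w = project(z - eta * Fw_prev)
        Fw = F(w)
        z = project(z - eta * Fw)
        Fw_prev = Fw
    return z

z0 = np.zeros(2 * n)
# A natural last-iterate progress measure is the norm of the projected
# residual, which vanishes exactly at solutions of the constrained VI.
for name, method in [("EG", extragradient), ("OGDA", optimistic_gda)]:
    z = method(z0)
    residual = np.linalg.norm(z - project(z - F(z)))
    print(f"{name}: projected residual {residual:.3e}")
```

The sketch also shows the practical trade-off between the two methods: extragradient pays two operator calls per iteration for its extrapolation, while the optimistic variant recycles the previous evaluation and needs only one.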
Latest AI/ML/Big Data Jobs
Data Analyst, Patagonia Action Works
@ Patagonia | Remote
Data & Insights Strategy & Innovation General Manager
@ Chevron Services Company, a division of Chevron U.S.A. Inc. | Houston, TX
Faculty members in Research areas such as Bayesian and Spatial Statistics; Data Privacy and Security; AI/ML; NLP; Image and Video Data Analysis
@ Ahmedabad University | Ahmedabad, India
Director, Applied Mathematics & Computational Research Division
@ Lawrence Berkeley National Lab | Berkeley, CA
Business Data Analyst
@ MainStreet Family Care | Birmingham, AL
Assistant/Associate Professor of the Practice in Business Analytics
@ Georgetown University McDonough School of Business | Washington, DC