Web: http://arxiv.org/abs/2205.05653

May 12, 2022, 1:11 a.m. | Dmitry Kovalev, Alexander Gasnikov

cs.LG updates on arXiv.org

In this paper, we revisit the smooth and strongly-convex-strongly-concave
minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020)
established the lower bound $\Omega\left(\sqrt{\kappa_x\kappa_y} \log
\frac{1}{\epsilon}\right)$ on the number of gradient evaluations required to
find an $\epsilon$-accurate solution, where $\kappa_x$ and $\kappa_y$ are
condition numbers for the strong convexity and strong concavity assumptions.
However, the existing state-of-the-art methods do not match this lower bound:
the algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation …
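
For context, the excerpt leaves the problem class implicit. The following is a minimal formalization of the standard smooth strongly-convex-strongly-concave saddle-point setting the abstract refers to; the smoothness constant $L$ and the strong-convexity/strong-concavity moduli $\mu_x$, $\mu_y$ are introduced here as assumptions and are not named in the excerpt:

\[
\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; f(x, y),
\]
where $f(\cdot, y)$ is $\mu_x$-strongly convex for every $y$, $f(x, \cdot)$ is $\mu_y$-strongly concave for every $x$, and the gradient $\nabla f$ is $L$-Lipschitz. The condition numbers are
\[
\kappa_x = \frac{L}{\mu_x}, \qquad \kappa_y = \frac{L}{\mu_y},
\]
so the cited lower bound states that any first-order method needs on the order of $\sqrt{\kappa_x \kappa_y} \log \frac{1}{\epsilon}$ gradient evaluations to reach an $\epsilon$-accurate solution.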

Tags: algorithm, arxiv, math, minimax optimization
