May 12, 2022, 1:11 a.m. | Dmitry Kovalev, Alexander Gasnikov

cs.LG updates on arXiv.org arxiv.org

In this paper, we revisit the smooth and strongly-convex-strongly-concave
minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020)
established the lower bound $\Omega\left(\sqrt{\kappa_x\kappa_y} \log
\frac{1}{\epsilon}\right)$ on the number of gradient evaluations required to
find an $\epsilon$-accurate solution, where $\kappa_x$ and $\kappa_y$ are
condition numbers for the strong convexity and strong concavity assumptions.
However, the existing state-of-the-art methods do not match this lower bound:
the algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation …
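To make the problem class concrete, here is a minimal sketch (not the paper's algorithm) of the classical extragradient method applied to a toy smooth, strongly-convex-strongly-concave saddle-point problem. The objective, step size, and constants below are illustrative assumptions, chosen only so the iterates visibly converge to the unique saddle point at the origin.

```python
# Toy strongly-convex-strongly-concave minimax problem (illustrative constants):
#   f(x, y) = (mu_x / 2) * x**2 + a * x * y - (mu_y / 2) * y**2
# f is strongly convex in x, strongly concave in y; the saddle point is (0, 0).
mu_x, mu_y, a = 1.0, 1.0, 4.0  # strong convexity/concavity moduli and coupling

def grad(x, y):
    """Partial gradients of f with respect to x and y."""
    return mu_x * x + a * y, a * x - mu_y * y

def extragradient(x, y, eta=0.05, iters=500):
    """Descend in x, ascend in y, with a lookahead (extrapolation) step."""
    for _ in range(iters):
        gx, gy = grad(x, y)
        x_half, y_half = x - eta * gx, y + eta * gy  # extrapolation point
        gx, gy = grad(x_half, y_half)
        x, y = x - eta * gx, y + eta * gy            # update from lookahead grads
    return x, y

x, y = extragradient(1.0, 1.0)
print(abs(x) < 1e-3, abs(y) < 1e-3)
```

In this quadratic setting the condition numbers reduce to ratios of smoothness to strong convexity/concavity constants; the paper's question is how the optimal gradient-evaluation count scales with $\kappa_x$ and $\kappa_y$ in general.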
