Jan. 3, 2022, 2:10 a.m. | Ziyi Chen, Shaocong Ma, Yi Zhou

cs.LG updates on arXiv.org

Alternating gradient-descent-ascent (AltGDA) is an optimization algorithm
widely used for model training in machine learning applications that require
solving a nonconvex minimax optimization problem. However, existing studies
show that it suffers from high computational complexity in nonconvex minimax
optimization. In this paper, we develop a single-loop, fast AltGDA-type
algorithm that leverages proximal gradient updates and momentum acceleration
to solve regularized nonconvex minimax optimization problems. By identifying
the intrinsic Lyapunov function of this …
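To make the abstract's ingredients concrete, here is a minimal NumPy sketch of a single-loop alternating GDA iteration combining proximal gradient updates with heavy-ball momentum. This is an illustrative reconstruction, not the paper's exact algorithm: the toy objective, the step sizes `eta_x`/`eta_y`, the momentum parameter `beta`, and the choice of an L1 regularizer (handled via soft-thresholding) are all assumptions.

```python
# Sketch of a single-loop AltGDA-type update with proximal steps and momentum.
# Assumptions (not from the paper): L1 regularizer on x, heavy-ball momentum,
# and a toy strongly-convex-strongly-concave objective for demonstration.
import numpy as np

def prox_l1(v, tau):
    """Proximal operator of tau * ||v||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def alt_gda_prox_momentum(grad_x, grad_y, x0, y0,
                          eta_x=0.05, eta_y=0.05, beta=0.5,
                          lam=0.01, iters=1000):
    """Alternating prox-gradient descent (in x) / ascent (in y) with momentum."""
    x, y = x0.copy(), y0.copy()
    mx = np.zeros_like(x)  # momentum buffer for the min player
    my = np.zeros_like(y)  # momentum buffer for the max player
    for _ in range(iters):
        # Descent step on x with momentum, then the prox of the regularizer.
        mx = beta * mx + grad_x(x, y)
        x = prox_l1(x - eta_x * mx, eta_x * lam)
        # Ascent step on y uses the freshly updated x (the "alternating" part).
        my = beta * my + grad_y(x, y)
        y = y + eta_y * my
    return x, y

# Toy regularized minimax problem (hypothetical example):
#   min_x max_y  0.5*||x||^2 + x^T A y - 0.5*||y||^2 + lam*||x||_1
A = np.array([[1.0, 0.2], [0.0, 0.5]])
gx = lambda x, y: x + A @ y    # gradient of the smooth part in x
gy = lambda x, y: A.T @ x - y  # gradient in y
x_star, y_star = alt_gda_prox_momentum(gx, gy, np.ones(2), np.ones(2))
print(x_star, y_star)  # both iterates should approach the zero saddle point
```

Note the single-loop structure: each iteration performs exactly one proximal descent step and one ascent step, with no inner solver, which is the feature the paper contrasts with higher-complexity nested-loop methods.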

