March 13, 2024, 4:42 a.m. | Konstantinos Emmanouilidis, René Vidal, Nicolas Loizou

cs.LG updates on arXiv.org

arXiv:2403.07148v1 Announce Type: cross
Abstract: The Stochastic Extragradient (SEG) method is one of the most popular algorithms for solving the finite-sum min-max optimization and variational inequality problems (VIPs) that appear in various machine learning tasks. However, existing convergence analyses of SEG focus on its with-replacement variants, whereas practical implementations randomly reshuffle the components and use them sequentially. Unlike the well-studied with-replacement variants, SEG with Random Reshuffling (SEG-RR) lacks established theoretical guarantees. In this work, we provide a convergence analysis of …
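For intuition, here is a minimal sketch of what a random-reshuffling extragradient loop looks like, assuming the same-sample variant in which one component operator F_i is used for both the extrapolation and the update step. The function name seg_rr, the operator list F_components, and the step sizes gamma1 and gamma2 are illustrative assumptions, not the paper's exact algorithm or notation.

```python
import numpy as np

def seg_rr(F_components, x0, gamma1, gamma2, n_epochs, seed=0):
    """Illustrative sketch of Stochastic Extragradient with Random
    Reshuffling (SEG-RR); step sizes and naming are hypothetical."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = len(F_components)
    for _ in range(n_epochs):
        # One epoch: shuffle the component indices once, then sweep
        # through them without replacement (in contrast, with-replacement
        # SEG draws an independent index at every iteration).
        for i in rng.permutation(n):
            F_i = F_components[i]
            x_half = x - gamma1 * F_i(x)     # extrapolation step
            x = x - gamma2 * F_i(x_half)     # update at the extrapolated point
    return x

# Toy usage on a bilinear min-max problem min_u max_v u * a_i * v,
# whose VIP operator components are F_i([u, v]) = [a_i * v, -a_i * u].
a = [1.0, 2.0, 0.5]
ops = [lambda z, ai=ai: np.array([ai * z[1], -ai * z[0]]) for ai in a]
print(seg_rr(ops, x0=[1.0, 1.0], gamma1=0.1, gamma2=0.05, n_epochs=50))
```

The bilinear example is a standard sanity check for extragradient-type methods: plain stochastic gradient descent-ascent cycles or diverges on it, while the extrapolation step lets the iterates contract toward the solution.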

