March 13, 2024, 4:42 a.m. | Konstantinos Emmanouilidis, René Vidal, Nicolas Loizou

cs.LG updates on arXiv.org

arXiv:2403.07148v1 Announce Type: cross
Abstract: The Stochastic Extragradient (SEG) method is one of the most popular algorithms for solving finite-sum min-max optimization and variational inequality problems (VIPs) arising in various machine learning tasks. However, existing convergence analyses of SEG focus on its with-replacement variants, while practical implementations randomly reshuffle the components and use them sequentially. Unlike the well-studied with-replacement variants, SEG with Random Reshuffling (SEG-RR) lacks established theoretical guarantees. In this work, we provide a convergence analysis of …
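To make the distinction concrete, here is a minimal sketch of SEG with Random Reshuffling on a toy finite-sum bilinear min-max problem. The problem instance, matrices A_i, step sizes, and epoch count are illustrative assumptions for this sketch, not values or pseudocode from the paper; the paper's analysis covers more general settings.

```python
import numpy as np

# Sketch of SEG with Random Reshuffling (SEG-RR) on an assumed toy problem:
#     min_x max_y (1/n) * sum_i x^T A_i y
# Each epoch draws one permutation of the n components and sweeps through
# them in that order, instead of sampling components with replacement.

rng = np.random.default_rng(0)
n, d = 10, 5                          # number of components, problem dimension
A = rng.standard_normal((n, d, d))   # illustrative component matrices A_i

def F_i(i, x, y):
    """Operator of the i-th component: (grad_x f_i, -grad_y f_i)."""
    return A[i] @ y, -A[i].T @ x

def seg_rr(x, y, gamma1=0.05, gamma2=0.05, epochs=200):
    for _ in range(epochs):
        perm = rng.permutation(n)         # reshuffle once per epoch ...
        for i in perm:                    # ... then use components sequentially
            # Extrapolation step with component i
            gx, gy = F_i(i, x, y)
            x_bar, y_bar = x - gamma1 * gx, y - gamma1 * gy
            # Update step, evaluating the same component at the extrapolated point
            gx, gy = F_i(i, x_bar, y_bar)
            x, y = x - gamma2 * gx, y - gamma2 * gy
    return x, y

x0 = rng.standard_normal(d)
y0 = rng.standard_normal(d)
x, y = seg_rr(x0, y0)
# For this bilinear toy problem the solution is (0, 0); with small enough
# step sizes the iterates should shrink toward a neighborhood of it.
print(np.linalg.norm(x), np.linalg.norm(y))
```

A with-replacement variant would instead draw i uniformly at random at every inner step; the only change above is replacing the per-epoch permutation with independent sampling.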

