First Order Methods with Markovian Noise: from Acceleration to Variational Inequalities
April 2, 2024, 7:44 p.m. | Aleksandr Beznosikov, Sergey Samsonov, Marina Sheshukova, Alexander Gasnikov, Alexey Naumov, Eric Moulines
cs.LG updates on arXiv.org
Abstract: This paper delves into stochastic optimization problems that involve Markovian noise. We present a unified approach for the theoretical analysis of first-order gradient methods for stochastic optimization and variational inequalities. Our approach covers scenarios for both non-convex and strongly convex minimization problems. To achieve an optimal (linear) dependence on the mixing time of the underlying noise sequence, we use the randomized batching scheme, which is based on the multilevel Monte Carlo method. Moreover, our technique …
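To make the randomized batching idea concrete, below is a minimal sketch (not the authors' implementation) of an MLMC-style gradient estimator driven by a toy AR(1) chain standing in for the Markovian noise. The oracle `grad_fn`, the `ar1_step` chain, the geometric level distribution, and the truncation level `j_max` are all illustrative assumptions; the exact weights and truncation follow the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Markovian noise: an AR(1) chain z_{t+1} = rho * z_t + noise.
# Illustrative stand-in for a general Markov chain; rho controls mixing speed.
def ar1_step(z, rho=0.9):
    return rho * z + np.sqrt(1 - rho ** 2) * rng.standard_normal(z.shape)

# Assumed stochastic gradient oracle for f(x) = 0.5 * ||x||^2,
# observed through the chain state: grad(x, z) = x + z.
def grad_fn(x, z):
    return x + z

def mlmc_gradient(x, z, j_max=10):
    """Randomized-batching (multilevel Monte Carlo) gradient estimator.

    Draws a random level J with P(J = j) = 2^{-j}, averages gradients over
    2^J consecutive chain states, and forms the telescoping correction
    g_0 + 2^J * (g_J - g_{J-1}). Returns the estimate and the advanced
    chain state.
    """
    j = min(int(rng.geometric(p=0.5)), j_max)
    grads = []
    for _ in range(2 ** j):
        z = ar1_step(z)
        grads.append(grad_fn(x, z))
    grads = np.array(grads)
    g_fine = grads.mean(axis=0)                    # average over 2^J samples
    g_coarse = grads[: 2 ** (j - 1)].mean(axis=0)  # first 2^{J-1} samples
    g0 = grads[0]                                  # single-sample baseline
    return g0 + (2 ** j) * (g_fine - g_coarse), z

# Plain SGD driven by the estimator on the toy problem.
x = np.ones(5)
z = np.zeros(5)
for t in range(200):
    g, z = mlmc_gradient(x, z)
    x -= 0.05 * g
print("||x|| after 200 steps:", np.linalg.norm(x))
```

The point of the geometric level distribution is that the expected batch size per call stays O(log) in the truncation level, while the telescoping correction cancels the bias that correlated (slowly mixing) chain samples would otherwise introduce.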