Aug. 12, 2022, 1:11 a.m. | Antoine Godichon-Baggioni (LPSM, UMR 8001), Olivier Wintenberger (LPSM, UMR 8001), Nicklas Werge (LPSM, UMR 8001)

stat.ML updates on arXiv.org

We consider the stochastic approximation problem in a streaming framework where an objective is minimized through unbiased estimates of its gradients. In this framework, time-varying data streams must be processed sequentially. Our methods are Stochastic Gradient (SG)-based owing to their broad applicability and computational advantages. We provide a non-asymptotic analysis of the convergence of several SG-based methods, including the classical SG descent (a.k.a. the Robbins-Monro algorithm), constant and time-varying mini-batch SG methods, and their averaged estimates …
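
To make the estimators mentioned in the abstract concrete, here is a minimal sketch of streaming mini-batch SG with Polyak-Ruppert averaging of the iterates. The quadratic (least-squares) objective, the batch-size schedule `n_t`, and the step-size exponent are illustrative assumptions, not the paper's exact setting or results.

```python
# Sketch: streaming mini-batch SG with Polyak-Ruppert averaging.
# Assumptions (not from the paper): least-squares objective, simulated
# Gaussian data stream, batch-size schedule n_t and step size gamma_t below.
import numpy as np

rng = np.random.default_rng(0)
d = 5
theta_star = rng.normal(size=d)            # unknown minimizer of the objective

def stream_batch(n):
    """Simulate one block of n streaming observations (X, y)."""
    X = rng.normal(size=(n, d))
    y = X @ theta_star + rng.normal(scale=0.5, size=n)
    return X, y

def grad(theta, X, y):
    """Unbiased mini-batch gradient estimate of the least-squares objective."""
    return X.T @ (X @ theta - y) / len(y)

theta = np.zeros(d)                        # SG iterate (Robbins-Monro recursion)
theta_bar = np.zeros(d)                    # Polyak-Ruppert average of the iterates
T = 2000
for t in range(1, T + 1):
    n_t = 1 + t // 100                     # time-varying mini-batch size (assumed schedule)
    gamma_t = 1.0 / t**0.66                # decaying step size gamma_t ~ t^{-alpha}
    X, y = stream_batch(n_t)
    theta -= gamma_t * grad(theta, X, y)
    theta_bar += (theta - theta_bar) / t   # running average, updated online

print("SG error       :", np.linalg.norm(theta - theta_star))
print("Averaged error :", np.linalg.norm(theta_bar - theta_star))
```

Running the sketch typically shows the averaged estimate `theta_bar` at least as close to `theta_star` as the last iterate, which is the qualitative behavior the averaged estimators are designed to deliver.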
