April 24, 2023, 12:46 a.m. | Olivier Wintenberger (LPSM, UMR 8001)

cs.LG updates on arXiv.org

We introduce a general framework of stochastic online convex optimization to
obtain fast-rate stochastic regret bounds. We prove that algorithms such as
Online Newton Step and a scale-free version of Bernstein Online Aggregation
achieve best-known rates in unbounded stochastic settings. We apply our
approach to calibrate parametric probabilistic forecasters of non-stationary
sub-Gaussian time series. Our fast-rate stochastic regret bounds are anytime
valid. Our proofs combine self-bounded and Poissonian inequalities for
martingales and sub-Gaussian random variables, respectively, under a stochastic …
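The abstract refers to Online Newton Step. For intuition, here is a minimal sketch of the classical ONS update (in the spirit of Hazan, Agarwal, and Kale), not the paper's stochastic variant: the quadratic losses, target vector, step parameter `gamma`, and horizon below are made-up demo choices, and the Euclidean-ball rescaling is a simplification of the exact projection in the `A_t`-norm.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, gamma = 3, 200, 0.5          # dimension, rounds, ONS step parameter (demo values)
x = np.zeros(d)                    # current iterate
A = np.eye(d)                      # second-order matrix, initialized as eps * I with eps = 1
x_star = np.array([0.5, -0.3, 0.2])  # hypothetical target for the synthetic stream

losses = []
for t in range(T):
    a = rng.normal(size=d)                  # feature vector for round t
    y = a @ x_star + 0.01 * rng.normal()    # noisy linear observation
    losses.append((a @ x - y) ** 2)         # squared loss suffered at round t
    g = 2 * (a @ x - y) * a                 # gradient of the squared loss at x
    A += np.outer(g, g)                     # accumulate outer products of gradients
    x = x - (1.0 / gamma) * np.linalg.solve(A, g)  # Newton-like step
    n = np.linalg.norm(x)                   # crude projection onto the unit ball
    if n > 1.0:                             # (exact ONS projects in the A-norm)
        x = x / n
```

The key design point is that `A` grows with the observed gradients, so step sizes shrink adaptively along informative directions, which is what yields logarithmic regret for exp-concave losses.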

