Stochastic Online Convex Optimization. Application to probabilistic time series forecasting. (arXiv:2102.00729v3 [cs.LG] UPDATED)
cs.LG updates on arXiv.org
We introduce a general framework of stochastic online convex optimization to
obtain fast-rate stochastic regret bounds. We prove that algorithms such as
Online Newton Step and a scale-free version of Bernstein Online Aggregation
achieve best-known rates in unbounded stochastic settings. We apply our
approach to calibrate parametric probabilistic forecasters of non-stationary
sub-Gaussian time series. Our fast-rate stochastic regret bounds are any-time
valid. Our proofs combine self-bounded and Poissonian inequalities for
martingales and sub-Gaussian random variables, respectively, under a stochastic …
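For intuition about the aggregation side of the abstract, here is a minimal sketch of generic exponential-weights (Hedge-style) aggregation of experts. This is a simplified relative of Bernstein Online Aggregation, not the paper's scale-free, second-order variant; the function name and the fixed learning rate `eta` are illustrative assumptions, not from the paper.

```python
import numpy as np

def exponential_weights(expert_losses, eta=0.5):
    """Aggregate K experts online with exponential weights (Hedge).

    Simplified stand-in for Bernstein Online Aggregation: the paper's
    scale-free variant adds a second-order correction and adapts eta.

    expert_losses: array of shape (T, K), loss of each expert per round.
    Returns the (T, K) sequence of weight vectors used at each round.
    """
    T, K = expert_losses.shape
    w = np.full(K, 1.0 / K)          # uniform prior over experts
    weights = np.empty((T, K))
    for t in range(T):
        weights[t] = w
        w = w * np.exp(-eta * expert_losses[t])  # multiplicative update
        w /= w.sum()                              # renormalize to a simplex
    return weights

# Toy usage: expert 0 incurs smaller loss every round, so the
# aggregation concentrates its weight on expert 0 over time.
losses = np.array([[0.1, 0.9]] * 20)
W = exponential_weights(losses)
```

The multiplicative update exponentially penalizes cumulative loss, which is what yields the fast-rate regret guarantees the abstract refers to in the well-specified stochastic setting.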