Learning with little mixing. (arXiv:2206.08269v1 [cs.LG])
Web: http://arxiv.org/abs/2206.08269
June 17, 2022, 1:12 a.m. | Ingvar Ziemann, Stephen Tu
stat.ML updates on arXiv.org
We study square loss in a realizable time-series framework with martingale
difference noise. Our main result is a fast rate excess risk bound which shows
that whenever a trajectory hypercontractivity condition holds, the risk of the
least-squares estimator on dependent data matches the iid rate order-wise after
a burn-in time. In comparison, many existing results in learning from dependent
data have rates where the effective sample size is deflated by a factor of the
mixing-time of the underlying process, even …
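As a concrete instance of the setting the abstract describes (not the paper's analysis), consider least-squares estimation from a single dependent trajectory of a scalar linear system x_{t+1} = a x_t + w_t. The sketch below is a minimal illustration under assumed values: a true parameter a = 0.9, trajectory length T = 5000, and iid Gaussian noise, which is one example of a martingale difference sequence; the realizable regression function is f(x) = a x.

# Minimal sketch of least-squares on dependent data (illustrative,
# not the paper's method): fit a from one trajectory of
# x_{t+1} = a * x_t + w_t with martingale difference noise w_t.
import numpy as np

rng = np.random.default_rng(0)

a_true = 0.9   # assumed true parameter; |a| < 1 keeps the chain stable
T = 5000       # assumed trajectory length
sigma = 1.0    # noise scale

# Roll out a single trajectory x_0, ..., x_T.
x = np.zeros(T + 1)
for t in range(T):
    x[t + 1] = a_true * x[t] + sigma * rng.standard_normal()

# Least squares on the dependent pairs (x_t, x_{t+1}):
# a_hat = argmin_a sum_t (x_{t+1} - a * x_t)^2.
a_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])

# Excess risk of the plug-in predictor a_hat * x relative to the true
# regression function a_true * x, using the stationary second moment
# E[x^2] = sigma^2 / (1 - a_true^2) of this AR(1) process:
# E[(a_hat * x - a_true * x)^2] = (a_hat - a_true)^2 * E[x^2].
second_moment = sigma**2 / (1 - a_true**2)
excess_risk = (a_hat - a_true) ** 2 * second_moment

print(f"a_hat = {a_hat:.4f}, excess risk ~ {excess_risk:.2e}")

The paper's point, in this vocabulary, is that after a burn-in in T, the excess risk of such a least-squares estimator can decay at the same order as it would on iid data, rather than being slowed by the mixing time of the trajectory.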