Universal Online Learning with Gradient Variations: A Multi-layer Online Ensemble Approach
April 17, 2024, 4:43 a.m. | Yu-Hu Yan, Peng Zhao, Zhi-Hua Zhou
cs.LG updates on arXiv.org arxiv.org
Abstract: In this paper, we propose an online convex optimization approach with two different levels of adaptivity. On a higher level, our approach is agnostic to the unknown types and curvatures of the online functions, while at a lower level, it can exploit the unknown niceness of the environments and attain problem-dependent guarantees. Specifically, we obtain $\mathcal{O}(\log V_T)$, $\mathcal{O}(d \log V_T)$ and $\hat{\mathcal{O}}(\sqrt{V_T})$ regret bounds for strongly convex, exp-concave and convex loss functions, respectively, where $d$ …
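To make the setting concrete, below is a minimal sketch of a plain online convex optimization loop (illustrative only, not the paper's multi-layer ensemble method): at each round the learner plays a point, observes a gradient, and takes a descent step. It also tracks a running proxy for the gradient variation $V_T$, the quantity the paper's problem-dependent regret bounds scale with. The function names and the toy loss stream are assumptions made for this example.

```python
import numpy as np

def online_gradient_descent(grad_fns, x0, eta=0.1):
    """Plain online gradient descent (a baseline, not the paper's method).

    Each round t: play x_t, observe gradient g_t = grad_fns[t](x_t),
    step x_{t+1} = x_t - eta * g_t. Also accumulates a proxy for the
    gradient variation V_T = sum_t ||g_t - g_{t-1}||^2, measured at the
    played iterates (the paper's V_T takes a supremum over the domain).
    """
    x = np.asarray(x0, dtype=float)
    prev_g = None
    V_T = 0.0
    iterates = []
    for grad in grad_fns:
        iterates.append(x.copy())
        g = grad(x)
        if prev_g is not None:
            V_T += float(np.sum((g - prev_g) ** 2))
        prev_g = g
        x = x - eta * g
    return iterates, V_T

# Toy stream: quadratic losses f_t(x) = ||x - c_t||^2 whose minimizers
# c_t drift slowly, so the gradient variation stays small -- the "nice
# environment" regime where variation-dependent bounds pay off.
rng = np.random.default_rng(0)
centers = np.cumsum(0.01 * rng.standard_normal((50, 2)), axis=0)
grads = [lambda x, c=c: 2.0 * (x - c) for c in centers]
iterates, V_T = online_gradient_descent(grads, x0=[1.0, -1.0])
print(f"rounds={len(iterates)}, V_T={V_T:.4f}")
```

Vanilla OGD attains only the worst-case $\mathcal{O}(\sqrt{T})$ rate for convex losses; the paper's contribution is an ensemble that adapts to the unknown curvature (convex, exp-concave, or strongly convex) and replaces $T$ with the variation $V_T$ in the bounds above.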