Web: http://arxiv.org/abs/2201.06656

Jan. 27, 2022, 2:11 a.m. | Leo Kozachkov, Patrick M. Wensing, Jean-Jacques Slotine

cs.LG updates on arXiv.org

We prove that Riemannian contraction in a supervised learning setting implies
generalization. Specifically, we show that if an optimizer is contracting in
some Riemannian metric with rate $\lambda > 0$, it is uniformly algorithmically
stable with rate $\mathcal{O}(1/\lambda n)$, where $n$ is the number of
labelled examples in the training set. The results hold for stochastic and
deterministic optimization, in both continuous and discrete time, for convex
and non-convex loss surfaces. The associated generalization bounds reduce to
well-known results in the …
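
The core claim (contraction at rate $\lambda$ implies uniform algorithmic stability at rate $\mathcal{O}(1/\lambda n)$) can be sanity-checked numerically in the simplest setting, where the Riemannian metric is just the flat Euclidean one. The sketch below is not from the paper; the ridge-regression loss, step size, and data are illustrative assumptions. Gradient descent on a $\mu$-strongly convex loss with a small enough step is contracting with rate roughly $\mu$, so two runs on training sets that differ in a single labelled example should end up a distance on the order of $1/(\mu n)$ apart.

    # Minimal numerical sketch (illustrative, not from the paper): gradient
    # descent on a strongly convex ridge loss is contracting, so two runs on
    # neighbouring training sets should end up O(1/(lambda*n)) apart.
    import numpy as np

    rng = np.random.default_rng(0)

    n, d = 200, 5                 # labelled examples, parameter dimension
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

    mu = 1.0                      # ridge weight; strong convexity gives contraction rate lambda ~ mu
    eta = 0.01                    # step size, small enough for a (1 - eta*mu) contraction per step

    def grad(w, X, y):
        # Gradient of (1/2n)*||Xw - y||^2 + (mu/2)*||w||^2
        return X.T @ (X @ w - y) / len(y) + mu * w

    # Neighbouring dataset: replace a single labelled example.
    X2, y2 = X.copy(), y.copy()
    X2[0] = rng.normal(size=d)
    y2[0] = rng.normal()

    w1 = np.zeros(d)
    w2 = np.zeros(d)
    for _ in range(5000):
        w1 = w1 - eta * grad(w1, X, y)
        w2 = w2 - eta * grad(w2, X2, y2)

    gap = np.linalg.norm(w1 - w2)
    print(f"parameter gap between neighbouring runs: {gap:.4f}")
    print(f"1/(lambda*n) reference scale:            {1.0 / (mu * n):.4f}")

Comparing the printed parameter gap against the $1/(\mu n)$ reference scale gives a quick empirical read on the stability rate; it is a plausibility check under these assumptions, not a reproduction of the paper's analysis.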

Tags: arxiv, learning, supervised learning
