Web: http://arxiv.org/abs/2202.09889

June 17, 2022, 1:11 a.m. | Chen Cheng, John Duchi, Rohith Kuditipudi

cs.LG updates on arXiv.org

We examine the necessity of interpolation in overparameterized models, that
is, when achieving optimal predictive risk in machine learning problems
requires (nearly) interpolating the training data. In particular, we consider
simple overparameterized linear regression $y = X \theta + w$ with random
design $X \in \mathbb{R}^{n \times d}$ under the proportional asymptotics $d/n
\to \gamma \in (1, \infty)$. We precisely characterize how prediction (test)
error necessarily scales with training error in this setting. An implication of
this characterization is that …
