Feb. 26, 2024, 5:43 a.m. | Jonathan Kelner, Frederic Koehler, Raghu Meka, Dhruv Rohatgi

cs.LG updates on arXiv.org

arXiv:2402.15409v1 Announce Type: cross
Abstract: It is well known that the statistical performance of Lasso can suffer significantly when the covariates of interest have strong correlations. In particular, the prediction error of Lasso becomes much worse than that of computationally inefficient alternatives like Best Subset Selection. Due to a large conjectured computational-statistical tradeoff in the problem of sparse linear regression, it may be impossible to close this gap in general.
In this work, we propose a natural sparse linear regression setting where strong …
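The abstract contrasts Lasso with Best Subset Selection when covariates are strongly correlated. Below is a minimal Python sketch, not taken from the paper, of that kind of setup: covariates driven by a shared latent factor, with Lasso's test error compared against ordinary least squares restricted to the true support, used here only as a cheap stand-in for Best Subset Selection. The dimensions, noise level, and regularization strength are arbitrary illustrative choices.

```python
# Illustrative sketch only: strongly correlated covariates via a shared latent
# factor, comparing Lasso's prediction error to least squares on the true
# support (a computationally cheap stand-in for Best Subset Selection).
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, d, k = 200, 50, 5  # samples, ambient dimension, sparsity (arbitrary choices)

def correlated_design(m):
    # Each covariate is mostly a shared latent factor plus a little idiosyncratic noise.
    latent = rng.standard_normal((m, 1))
    return 0.95 * latent + 0.05 * rng.standard_normal((m, d))

X = correlated_design(n)
beta = np.zeros(d)
beta[:k] = rng.standard_normal(k)  # k-sparse ground-truth coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)            # Lasso on all covariates
oracle = LinearRegression().fit(X[:, :k], y)  # least squares on the true support

X_test = correlated_design(1000)
y_test = X_test @ beta
print("Lasso  test MSE:", np.mean((lasso.predict(X_test) - y_test) ** 2))
print("Oracle test MSE:", np.mean((oracle.predict(X_test[:, :k]) - y_test) ** 2))
```

In designs like this, Lasso tends to spread weight across the correlated covariates rather than recovering the sparse support, which is the kind of degradation relative to subset-selection-style estimators that the abstract refers to.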

Subjects: cs.LG, cs.CC, cs.DS, math.ST, stat.ML, stat.TH
