Nov. 3, 2022, 1:12 a.m. | William T. Stephenson, Zachary Frangella, Madeleine Udell, Tamara Broderick

cs.LG updates on arXiv.org

Models like LASSO and ridge regression are extensively used in practice due
to their interpretability, ease of use, and strong theoretical guarantees.
Cross-validation (CV) is widely used for hyperparameter tuning in these models,
but do practical optimization methods minimize the true out-of-sample loss? A
recent line of research promises to show that the optimum of the CV loss
matches the optimum of the out-of-sample loss (possibly after simple
corrections). It remains to show how tractable it is to minimize the …
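The tractability question is about a one-dimensional search over the ridge penalty. As an illustration only (not the paper's method), the leave-one-out CV loss for ridge regression admits a standard closed-form shortcut that avoids refitting per held-out point, so the whole CV objective can be handed to a scalar optimizer. A minimal sketch, assuming synthetic data and using scipy's bounded minimize_scalar as the search routine:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def loo_cv_loss(lam, X, y):
        """Leave-one-out CV loss for ridge regression at penalty lam,
        via the closed-form shortcut (no per-fold refitting)."""
        n, d = X.shape
        # Hat matrix H = X (X^T X + lam I)^{-1} X^T
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
        residuals = y - H @ y
        # LOO residual_i = residual_i / (1 - H_ii)
        return np.mean((residuals / (1.0 - np.diag(H))) ** 2)

    # Hypothetical synthetic problem, for illustration only
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 10))
    y = X @ rng.standard_normal(10) + 0.5 * rng.standard_normal(100)

    # One-dimensional search over the penalty; the paper asks whether
    # such a local search can be trusted to find the global optimum.
    res = minimize_scalar(loo_cv_loss, bounds=(1e-6, 1e3),
                          args=(X, y), method="bounded")
    print(f"lambda* = {res.x:.4f}, LOO CV loss = {res.fun:.4f}")

Note that a bounded scalar search like this is only guaranteed to find the global minimum when the CV loss is unimodal (e.g., quasiconvex) in the penalty, which is exactly the kind of structure the abstract's tractability question concerns.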

Tags: arxiv, loss, regression, ridge, validation, validation loss
