Can we globally optimize cross-validation loss? Quasiconvexity in ridge regression. (arXiv:2107.09194v2 [stat.ML] UPDATED)
Nov. 3, 2022, 1:12 a.m. | William T. Stephenson, Zachary Frangella, Madeleine Udell, Tamara Broderick
cs.LG updates on arXiv.org
Models like LASSO and ridge regression are extensively used in practice due
to their interpretability, ease of use, and strong theoretical guarantees.
Cross-validation (CV) is widely used for hyperparameter tuning in these models,
but do practical optimization methods minimize the true out-of-sample loss? A
recent line of research promises to show that the optimum of the CV loss
matches the optimum of the out-of-sample loss (possibly after simple
corrections). It remains to show how tractable it is to minimize the …
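The tuning problem the abstract describes can be made concrete with a small sketch: minimizing the leave-one-out CV loss of ridge regression over the regularization strength. For ridge, the LOO residuals have a closed form via the hat matrix, so the CV loss can be evaluated exactly on a grid of candidate penalties. All names and the synthetic data below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic regression data.
n, d = 50, 10
X = rng.standard_normal((n, d))
beta = rng.standard_normal(d)
y = X @ beta + 0.5 * rng.standard_normal(n)

def loocv_ridge(X, y, lam):
    """Exact leave-one-out CV loss for ridge regression.

    Uses the hat-matrix identity: the LOO residual equals the
    full-data residual divided by (1 - H_ii), where
    H = X (X'X + lam I)^{-1} X'.
    """
    n, d = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

# Grid search over the regularization strength lambda.
lams = np.logspace(-3, 2, 50)
losses = [loocv_ridge(X, y, lam) for lam in lams]
best = lams[int(np.argmin(losses))]
print(f"best lambda ~ {best:.4g}")
```

The paper's question is whether this one-dimensional CV loss is well behaved (e.g., quasiconvex) so that such a search, or a local optimizer, reliably finds the global minimum rather than a spurious local one.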