Concentration inequalities for leave-one-out cross validation. (arXiv:2211.02478v1 [math.ST])
Nov. 7, 2022, 2:11 a.m. | Benny Avelin, Lauri Viitasaari
cs.LG updates on arXiv.org
In this article we prove that estimator stability is enough to show that leave-one-out cross validation is a sound procedure, by providing concentration bounds in a general framework. In particular, we provide concentration bounds beyond Lipschitz continuity assumptions on the loss or on the estimator. In order to obtain our results, we rely on random variables whose distributions satisfy the logarithmic Sobolev inequality, which provides us with a relatively rich class of distributions. We illustrate our method by considering several interesting examples, …
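
For reference, one standard form of the logarithmic Sobolev inequality (the textbook definition, not quoted from the paper, which may use a different normalization): a measure μ on R^d satisfies LSI with constant c if, for all smooth f,

    Ent_μ(f²) ≤ 2c · E_μ[|∇f|²],    where Ent_μ(g) = E_μ[g log g] − E_μ[g] log E_μ[g].

Gaussian measures satisfy this, for example, which is one reason the class is relatively rich.

The paper's bounds are theoretical, but the procedure they control is easy to state in code. Below is a minimal sketch of leave-one-out cross validation in Python; the generic `fit`/`loss` interface, the ridge-style example estimator, and the squared loss are illustrative assumptions, not the paper's setting.

    import numpy as np

    def loo_cv_error(X, y, fit, loss):
        # Leave-one-out CV: for each i, train on the n-1 remaining
        # samples and evaluate the loss on the held-out pair (X[i], y[i]).
        n = len(y)
        errors = np.empty(n)
        for i in range(n):
            mask = np.arange(n) != i           # drop sample i
            predictor = fit(X[mask], y[mask])  # estimator trained on n-1 points
            errors[i] = loss(predictor(X[i]), y[i])
        return errors.mean()

    # Illustrative estimator (a hypothetical ridge regression) and squared loss.
    def fit_ridge(X, y, lam=1.0):
        w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
        return lambda x: x @ w

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
    print(loo_cv_error(X, y, fit_ridge, lambda pred, target: (pred - target) ** 2))

The stability notion the paper works with concerns how much `predictor` changes when one sample is removed; the concentration bounds then control how far the averaged LOO error can deviate from its expectation.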