Learn then Test: Calibrating Predictive Algorithms to Achieve Risk Control. (arXiv:2110.01052v4 [cs.LG] UPDATED)
April 29, 2022, 1:12 a.m. | Anastasios N. Angelopoulos, Stephen Bates, Emmanuel J. Candès, Michael I. Jordan, Lihua Lei
cs.LG updates on arXiv.org
We introduce a framework for calibrating machine learning models so that
their predictions satisfy explicit, finite-sample statistical guarantees. Our
calibration algorithm works with any underlying model and (unknown)
data-generating distribution and does not require model refitting. The
framework addresses, among other examples, false discovery rate control in
multi-label classification, intersection-over-union control in instance
segmentation, and the simultaneous control of the type-1 error of outlier
detection and confidence set coverage in classification or regression. Our main
insight is to reframe the …
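The calibration procedure the abstract describes can be illustrated with a minimal sketch. The assumptions here go beyond what the (truncated) abstract states: candidate thresholds are scanned in a fixed sequence, losses are bounded in [0, 1], and each null hypothesis "risk at this threshold exceeds the target level" is tested with a Hoeffding-style p-value. This is an illustrative toy, not the paper's full method:

```python
import numpy as np

def hoeffding_p_value(mean_loss, n, alpha):
    """P-value for H0: E[loss] > alpha, from the empirical mean of n
    losses bounded in [0, 1] (Hoeffding's inequality)."""
    return float(np.exp(-2.0 * n * max(alpha - mean_loss, 0.0) ** 2))

def calibrate(lambdas, losses_per_lambda, alpha=0.1, delta=0.1):
    """Fixed-sequence testing sketch: walk the candidate thresholds in
    order, keep each one whose null hypothesis (risk too high) is
    rejected at level delta, and stop at the first failure. Any lambda
    returned then controls risk at level alpha with probability >= 1 - delta."""
    valid = []
    for lam, losses in zip(lambdas, losses_per_lambda):
        p = hoeffding_p_value(float(np.mean(losses)), len(losses), alpha)
        if p <= delta:
            valid.append(lam)
        else:
            break
    return valid

# Toy usage: per-example calibration losses observed at three thresholds.
lambdas = [0.9, 0.8, 0.7]
losses_per_lambda = [np.zeros(1000),          # risk 0.00 -> rejected (kept)
                     np.full(1000, 0.05),     # risk 0.05 -> rejected (kept)
                     np.full(1000, 0.50)]     # risk 0.50 -> test fails, stop
print(calibrate(lambdas, losses_per_lambda, alpha=0.1, delta=0.1))
```

Note that no model refitting occurs: calibration only consumes losses computed on held-out data, which is what lets the guarantee hold for any underlying model and unknown data distribution.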