Web: http://arxiv.org/abs/2110.09192

May 9, 2022, 1:11 a.m. | David Stutz, Krishnamurthy (Dj) Dvijotham, Ali Taylan Cemgil, Arnaud Doucet

cs.LG updates on arXiv.org arxiv.org

Modern deep-learning-based classifiers achieve very high accuracy on test data, but accuracy alone does not provide sufficient guarantees for safe deployment, especially in high-stakes AI applications such as medical diagnosis. Usually, predictions come without a reliable uncertainty estimate or a formal guarantee. Conformal prediction (CP) addresses these issues by using the classifier's predictions, e.g., its probability estimates, to construct confidence sets that contain the true class with a user-specified probability. However, using CP as a separate processing step after …
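The CP idea described above can be sketched with split conformal prediction, a standard CP variant. This is a minimal illustration, not the paper's method: the toy classifier, the Dirichlet-sampled probabilities, and the `1 - p(true class)` nonconformity score are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "classifier": softmax-style probability vectors for 3 classes
# on a held-out calibration set (stand-in for real model outputs).
n_cal, n_classes = 500, 3
cal_probs = rng.dirichlet(np.ones(n_classes) * 2.0, size=n_cal)
cal_labels = np.array([rng.choice(n_classes, p=p) for p in cal_probs])

alpha = 0.1  # user-specified miscoverage: sets cover the true class w.p. >= 90%

# Nonconformity score: 1 minus the probability assigned to the true class.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the finite-sample (n+1)/n correction.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(cal_scores, q_level, method="higher")

def confidence_set(probs):
    """All classes whose nonconformity score falls below the threshold."""
    return np.where(1.0 - probs <= q_hat)[0]

# A confident test prediction typically yields a small set.
print(confidence_set(np.array([0.9, 0.05, 0.05])))
```

The marginal coverage guarantee holds regardless of how good the underlying probability estimates are; poor estimates simply produce larger (less informative) confidence sets.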

