Onboard Out-of-Calibration Detection of Deep Learning Models using Conformal Prediction
May 7, 2024, 4:42 a.m. | Protim Bhattacharjee, Peter Jung
cs.LG updates on arXiv.org
Abstract: The black-box nature of deep learning models complicates their usage in critical applications such as remote sensing. Conformal prediction is a method to ensure trust in such scenarios. Subject to data exchangeability, conformal prediction provides finite-sample coverage guarantees in the form of a prediction set that is guaranteed to contain the true class within a user-defined error rate. In this letter we show that conformal prediction algorithms are related to the uncertainty …
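The coverage guarantee the abstract refers to states that, under exchangeability, P(Y ∈ C(X)) ≥ 1 − α for a user-defined error rate α. As a rough illustration only (not the algorithm proposed in this letter), here is a minimal sketch of split conformal prediction for classification; all function and variable names are hypothetical, and held-out calibration softmax scores are assumed to be available:

import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    # A sketch of split conformal prediction for classification.
    # cal_probs:  (n, K) softmax outputs on a held-out calibration set
    # cal_labels: (n,)   true class indices for the calibration set
    # test_probs: (m, K) softmax outputs on test inputs
    # alpha:      user-defined error rate (0.1 -> 90% target coverage)
    n = len(cal_labels)
    # Nonconformity score: 1 minus the probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile level: ceil((n+1)(1-alpha)) / n.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(scores, q_level, method="higher")
    # Prediction set: every class whose nonconformity score is at most qhat,
    # i.e. whose softmax probability is at least 1 - qhat.
    return test_probs >= 1.0 - qhat  # boolean mask of shape (m, K)

When calibration and test data are exchangeable, sets built this way contain the true class with probability at least 1 − α, which is the style of guarantee the letter builds on; comparing empirical coverage against that target is one natural way to flag out-of-calibration behaviour.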