Adaptive Bounding Box Uncertainties via Two-Step Conformal Prediction
March 13, 2024, 4:42 a.m. | Alexander Timans, Christoph-Nikolas Straehle, Kaspar Sakmann, Eric Nalisnick
cs.LG updates on arXiv.org (arxiv.org)
Abstract: Quantifying a model's predictive uncertainty is essential for safety-critical applications such as autonomous driving. We consider quantifying such uncertainty for multi-object detection. In particular, we leverage conformal prediction to obtain uncertainty intervals with guaranteed coverage for object bounding boxes. One challenge in doing so is that bounding box predictions are conditioned on the object's class label. Thus, we develop a novel two-step conformal approach that propagates uncertainty in predicted class labels into the uncertainty intervals …
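The coverage guarantee mentioned in the abstract comes from conformal prediction. As a rough illustration of the underlying mechanism (not the paper's two-step method), the following sketch shows standard split conformal prediction for a single regressed coordinate, such as one edge of a bounding box: compute absolute residuals on a held-out calibration set, take a finite-sample-corrected quantile, and widen every test prediction by that amount. All function and variable names here are illustrative.

```python
import numpy as np

def conformal_interval(cal_preds, cal_truth, test_preds, alpha=0.1):
    """Split conformal prediction intervals with ~(1 - alpha) marginal coverage.

    cal_preds / cal_truth: model predictions and ground truth on a calibration set.
    test_preds: predictions to wrap in uncertainty intervals.
    """
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(cal_preds - cal_truth)
    n = len(scores)
    # Finite-sample-corrected quantile level for valid coverage.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(scores, q_level, method="higher")
    # Symmetric interval around each test prediction.
    return test_preds - qhat, test_preds + qhat
```

In the paper's setting the complication is that box regressors are class-conditional, so the calibration scores depend on a predicted (possibly wrong) label; the proposed two-step procedure propagates that label uncertainty into the intervals, which this one-step sketch does not capture.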