Adaptive Bounding Box Uncertainties via Two-Step Conformal Prediction
March 13, 2024, 4:42 a.m. | Alexander Timans, Christoph-Nikolas Straehle, Kaspar Sakmann, Eric Nalisnick
cs.LG updates on arXiv.org
Abstract: Quantifying a model's predictive uncertainty is essential for safety-critical applications such as autonomous driving. We consider quantifying such uncertainty for multi-object detection. In particular, we leverage conformal prediction to obtain uncertainty intervals with guaranteed coverage for object bounding boxes. One challenge in doing so is that bounding box predictions are conditioned on the object's class label. Thus, we develop a novel two-step conformal approach that propagates uncertainty in predicted class labels into the uncertainty intervals …
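To make the setting concrete, below is a minimal sketch of standard one-step split conformal prediction applied per bounding-box coordinate, the kind of baseline the paper's two-step approach builds on. It does not reproduce the authors' method (the propagation of class-label uncertainty into the intervals is omitted), and all function and variable names are hypothetical.

```python
# Sketch: split conformal intervals for box coordinates (x1, y1, x2, y2).
# This is the plain one-step baseline; the paper's two-step variant additionally
# propagates uncertainty in the predicted class label, which is not shown here.
import numpy as np

def conformal_box_intervals(cal_pred, cal_gt, test_pred, alpha=0.1):
    """Per-coordinate conformal intervals with marginal coverage >= 1 - alpha.

    cal_pred, cal_gt: (n, 4) matched predicted / ground-truth boxes from a
                      held-out calibration set.
    test_pred:        (m, 4) predicted boxes at test time.
    alpha:            target miscoverage level (0.1 -> 90% coverage).
    """
    n = cal_pred.shape[0]
    # Nonconformity score: absolute coordinate error on the calibration set.
    scores = np.abs(cal_gt - cal_pred)                    # shape (n, 4)
    # Finite-sample-corrected quantile level, clipped to 1.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level, axis=0)              # shape (4,)
    # Symmetric interval around each predicted coordinate.
    return test_pred - q, test_pred + q
```

In a class-conditional variant, the quantile `q` would be computed separately per predicted class, which is exactly where errors in the class prediction leak into the intervals and motivate the two-step correction described in the abstract.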