March 13, 2024, 4:42 a.m. | Alexander Timans, Christoph-Nikolas Straehle, Kaspar Sakmann, Eric Nalisnick

cs.LG updates on arXiv.org

arXiv:2403.07263v1 Announce Type: cross
Abstract: Quantifying a model's predictive uncertainty is essential for safety-critical applications such as autonomous driving. We consider quantifying such uncertainty for multi-object detection. In particular, we leverage conformal prediction to obtain uncertainty intervals with guaranteed coverage for object bounding boxes. One challenge in doing so is that bounding box predictions are conditioned on the object's class label. Thus, we develop a novel two-step conformal approach that propagates uncertainty in predicted class labels into the uncertainty intervals …
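For context, here is a minimal sketch of the standard split conformal baseline on which such bounding-box intervals build. This is not the paper's two-step class-conditional method (which additionally propagates label uncertainty); the function name, the per-coordinate absolute-residual score, and the array layout are illustrative assumptions.

```python
import numpy as np

def conformal_box_intervals(cal_preds, cal_gts, test_preds, alpha=0.1):
    """Split conformal intervals for bounding-box coordinates.

    cal_preds, cal_gts: (n, 4) predicted / ground-truth box coordinates
    (x1, y1, x2, y2) on a held-out calibration set.
    test_preds: (m, 4) predicted boxes to wrap with intervals.
    Returns (lower, upper) arrays such that each true coordinate is
    covered with probability >= 1 - alpha (marginal, per coordinate).
    """
    n = len(cal_preds)
    # Nonconformity score: absolute residual per coordinate.
    scores = np.abs(cal_preds - cal_gts)  # shape (n, 4)
    # Finite-sample-corrected quantile level for coverage 1 - alpha.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, axis=0)  # shape (4,)
    return test_preds - q, test_preds + q
```

The paper's contribution is to make such intervals robust to the fact that box predictions are conditioned on a (possibly wrong) predicted class label, which this single-step baseline ignores.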

