Oct. 25, 2022, 1:14 a.m. | Sebastian Gruber, Florian Buettner

stat.ML updates on arXiv.org

Reliably estimating the uncertainty of a prediction throughout the model
lifecycle is crucial in many safety-critical applications.


The most common way to measure this uncertainty is via the predicted
confidence. While this tends to work well for in-domain samples, these
estimates are unreliable under domain drift.
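
As a rough illustration (not code from the paper), a minimal sketch of this common practice: taking the maximum softmax probability of a classifier as its predicted confidence, so that one minus the confidence serves as an uncertainty proxy. The function names here are hypothetical.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # Shift logits for numerical stability before exponentiating.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predicted_confidence(logits: np.ndarray) -> np.ndarray:
    # Confidence = probability assigned to the arg-max class;
    # 1 - confidence is then used as a crude uncertainty estimate.
    return softmax(logits).max(axis=-1)

logits = np.array([[2.0, 0.5, -1.0],   # fairly confident prediction
                   [0.1, 0.0, -0.1]])  # nearly uniform, low confidence
print(predicted_confidence(logits))    # roughly [0.79, 0.37]
```

As the abstract notes, such confidence scores tend to be reasonable in-domain but degrade under domain drift.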


Alternatively, a bias-variance decomposition makes it possible to measure the
predictive uncertainty directly across the entire input space.
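
For orientation only: the classical decomposition below applies to the squared error, whereas the paper is concerned with extending this kind of decomposition to general proper scores, which this sketch does not cover. Writing \( \hat f_D(x) \) for a model trained on a random dataset \( D \) and \( y \) for the target at input \( x \):

\[
\mathbb{E}_{D,\,y}\!\left[\big(y - \hat f_D(x)\big)^2\right]
= \underbrace{\big(\mathbb{E}_D[\hat f_D(x)] - \mathbb{E}[y \mid x]\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}_D\!\left[\big(\hat f_D(x) - \mathbb{E}_D[\hat f_D(x)]\big)^2\right]}_{\text{variance}}
+ \underbrace{\operatorname{Var}(y \mid x)}_{\text{noise}}
\]

Evaluated pointwise in \( x \), the variance term is one way to read off predictive uncertainty across the input space.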


However, such a decomposition for proper scores does not exist in the current
literature, and for exponential families it …

arxiv bias bias-variance general predictions uncertainty variance
