Web: http://arxiv.org/abs/2110.06435

June 20, 2022, 1:11 a.m. | Haichao Yu, Zhe Chen, Dong Lin, Gil Shamir, Jie Han

cs.LG updates on arXiv.org arxiv.org

Dropout has been commonly used to quantify prediction uncertainty, i.e., the variation of a model's predictions on a given input example. However, using dropout in practice can be expensive, as it requires running dropout inference many times.
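The expense described above comes from Monte Carlo dropout: uncertainty is read off the spread of many stochastic forward passes. A minimal NumPy sketch (toy one-layer model with arbitrary weights, not the paper's architecture) of this baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model; weights are arbitrary, for illustration only.
W = rng.normal(size=(8, 1))

def dropout_forward(x, p=0.5):
    """One stochastic forward pass with dropout applied to the inputs."""
    mask = rng.random(x.shape) > p       # keep each unit with prob 1 - p
    h = (x * mask) / (1 - p)             # inverted-dropout scaling
    return float(h @ W)

def mc_dropout_uncertainty(x, n_samples=100):
    """Monte Carlo dropout: many stochastic passes, return mean and std."""
    preds = np.array([dropout_forward(x) for _ in range(n_samples)])
    return preds.mean(), preds.std()     # std serves as the uncertainty

x = rng.normal(size=8)
mean, std = mc_dropout_uncertainty(x)
```

The `n_samples` forward passes are exactly the cost the paper aims to avoid.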

In this paper, we study how to estimate dropout prediction uncertainty in a resource-efficient manner. We demonstrate that neuron activation strengths can be used to estimate dropout prediction uncertainty under different dropout settings and on a variety of tasks using three large datasets, MovieLens, …

