Web: http://arxiv.org/abs/2003.03274

May 6, 2022, 1:11 a.m. | Kirill Fedyanin, Evgenii Tsymbalov, Maxim Panov

cs.LG updates on arXiv.org

Uncertainty estimation for machine learning models is important in many
scenarios, such as constructing confidence intervals for model predictions
and detecting out-of-distribution or adversarially generated points. In this
work, we show that modifying the sampling distributions for dropout layers in
neural networks improves the quality of uncertainty estimation. Our approach
consists of two steps: computing data-driven correlations between neurons,
and generating samples that include maximally diverse neurons. In a series of
experiments on simulated …
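The two steps described in the abstract can be sketched in a simplified form. The snippet below is an illustrative approximation, not the paper's method: it estimates neuron correlations from recorded activations and then greedily keeps neurons that are least correlated with those already kept (a max-min heuristic standing in for the paper's diversity-based sampling; the function name and `keep` parameter are assumptions for illustration).

```python
import numpy as np

def diverse_dropout_mask(activations, keep, seed=0):
    """Build a dropout mask that keeps mutually diverse neurons.

    activations: (n_samples, n_neurons) array of recorded activations.
    keep: number of neurons to leave active.

    A greedy sketch of diversity sampling, not the paper's exact algorithm.
    """
    rng = np.random.default_rng(seed)
    # Step 1: data-driven correlations between neurons.
    sim = np.abs(np.corrcoef(activations, rowvar=False))
    n = sim.shape[0]
    # Step 2: greedily select neurons that are least similar
    # to the neurons already kept.
    first = int(rng.integers(n))
    kept = [first]
    remaining = set(range(n)) - {first}
    while len(kept) < keep:
        best = min(remaining, key=lambda j: max(sim[j, k] for k in kept))
        kept.append(best)
        remaining.remove(best)
    mask = np.zeros(n)
    mask[kept] = 1.0
    return mask
```

At inference time, such masks could replace i.i.d. Bernoulli dropout masks when drawing Monte Carlo samples for uncertainty estimates, so that each sample activates a decorrelated subset of neurons.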

arxiv diversity dropout sampling uncertainty
