Improving Out-of-Distribution Detection via Epistemic Uncertainty Adversarial Training. (arXiv:2209.03148v2 [cs.LG] UPDATED)
Sept. 12, 2022, 1:12 a.m. | Derek Everett, Andre T. Nguyen, Luke E. Richards, Edward Raff
cs.LG updates on arXiv.org
The quantification of uncertainty is important for the adoption of machine
learning, especially for rejecting out-of-distribution (OOD) data back to human
experts for review. Yet progress has been slow, as a balance must be struck
between computational efficiency and the quality of uncertainty estimates. For
this reason, many use deep ensembles of neural networks or Monte Carlo dropout
for reasonable uncertainty estimates at relatively minimal compute and memory.
Surprisingly, when we focus on the real-world applicable constraint of $\leq
1\%$ …
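The abstract mentions Monte Carlo dropout as a common low-overhead route to uncertainty estimates. As an illustration only (not the paper's proposed method), the following sketch keeps dropout active at inference on a toy NumPy "network" with made-up weights, then uses the spread of the sampled class probabilities as a rough epistemic-uncertainty proxy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer classifier with arbitrary fixed weights,
# purely for illustration (not the architecture from the paper).
W = rng.normal(size=(16, 4))
b = rng.normal(size=4)

def forward(x, sample_rng, p_drop=0.5):
    # Monte Carlo dropout: keep the dropout mask active at inference time.
    mask = sample_rng.random(x.shape) > p_drop
    h = x * mask / (1.0 - p_drop)  # inverted-dropout scaling
    logits = h @ W + b
    e = np.exp(logits - logits.max())  # stable softmax
    return e / e.sum()

def mc_dropout_predict(x, n_samples=200, seed=42):
    sample_rng = np.random.default_rng(seed)
    probs = np.stack([forward(x, sample_rng) for _ in range(n_samples)])
    mean = probs.mean(axis=0)  # predictive class distribution
    # Crude epistemic proxy: total variance of class probabilities
    # across the stochastic forward passes.
    epistemic = probs.var(axis=0).sum()
    return mean, epistemic

x = rng.normal(size=16)
mean, epistemic = mc_dropout_predict(x)
```

High disagreement across the stochastic passes (large `epistemic`) is the usual signal for flagging an input as possibly OOD and routing it to a human reviewer.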