Gradient-Regularized Out-of-Distribution Detection
April 19, 2024, 4:42 a.m. | Sina Sharifi, Taha Entesari, Bardia Safaei, Vishal M. Patel, Mahyar Fazlyab
cs.LG updates on arXiv.org arxiv.org
Abstract: A key challenge for neural networks in real-world applications is the overconfident errors these models make when the input data do not come from the original training distribution.
The task of identifying such inputs is known as Out-of-Distribution (OOD) detection.
Many state-of-the-art OOD methods employ an auxiliary dataset as a surrogate for OOD data during training to achieve improved performance.
However, these methods fail to fully exploit the local information embedded in the auxiliary dataset.
In this work, we …
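The abstract's premise, training against an auxiliary outlier dataset, is commonly realized as outlier exposure: alongside the usual classification loss, the model is penalized for making confident predictions on auxiliary outliers, pushing those predictions toward the uniform distribution. This is not the paper's proposed method (which adds gradient-based regularization on top of such objectives); the sketch below is a minimal NumPy illustration of the standard outlier-exposure loss and the maximum-softmax-probability (MSP) OOD score, with all function names chosen for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the class dimension."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def outlier_exposure_loss(logits_in, labels_in, logits_out, lam=0.5):
    """Outlier-exposure-style objective (illustrative sketch):
    cross-entropy on in-distribution samples, plus a term that
    penalizes confident predictions on auxiliary outliers by
    measuring cross-entropy against the uniform distribution."""
    p_in = softmax(logits_in)
    n = len(labels_in)
    ce_in = -np.mean(np.log(p_in[np.arange(n), labels_in] + 1e-12))
    # Cross-entropy of outlier predictions against uniform targets (1/k each):
    k = logits_out.shape[1]
    log_p_out = np.log(softmax(logits_out) + 1e-12)
    ce_out = -np.mean(log_p_out.sum(axis=1)) / k
    return ce_in + lam * ce_out

def msp_score(logits):
    """Maximum softmax probability: a standard OOD score.
    Low MSP suggests the input is out-of-distribution."""
    return softmax(logits).max(axis=1)
```

At inference time, inputs whose MSP falls below a threshold tuned on validation data are flagged as OOD; the outlier-exposure term makes that threshold more discriminative by lowering confidence on outlier-like inputs during training.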