Feb. 19, 2024, 5:42 a.m. | Steven Landgraf, Markus Hillemann, Theodor Kapler, Markus Ulrich

cs.LG updates on arXiv.org

arXiv:2402.10580v1 Announce Type: cross
Abstract: Quantifying predictive uncertainty has emerged as a possible solution to common challenges of deep neural networks, such as overconfidence and a lack of explainability and robustness, albeit one that is often computationally expensive. Many real-world applications are multi-modal in nature and hence benefit from multi-task learning. In autonomous driving, for example, jointly solving semantic segmentation and monocular depth estimation has proven valuable. In this work, we first combine different uncertainty quantification methods with …
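The abstract refers to quantifying predictive uncertainty in deep networks. The specific methods the paper combines are not named in the truncated text, but a widely used baseline for this is Monte Carlo dropout: run T stochastic forward passes and summarize the resulting predictive distribution, e.g. via its entropy. A minimal, hypothetical NumPy sketch (random logits stand in for an actual segmentation model):

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def mc_dropout_uncertainty(stochastic_logits):
    """Given logits from T stochastic forward passes, shape (T, N, C),
    return the mean class probabilities (N, C) and the predictive
    entropy per sample (N,), a common per-pixel uncertainty score."""
    probs = softmax(stochastic_logits)                 # (T, N, C)
    mean_probs = probs.mean(axis=0)                    # average over passes
    entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=-1)
    return mean_probs, entropy

# Simulate T=8 stochastic passes over N=4 pixels with C=3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 4, 3))
mean_probs, entropy = mc_dropout_uncertainty(logits)
```

High entropy flags pixels where the stochastic passes disagree; in a multi-task setup such a score can be computed per task head (e.g. segmentation and depth) from the same shared backbone.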

