March 18, 2024, 4:41 a.m. | Tim G. J. Rudner, Ya Shi Zhang, Andrew Gordon Wilson, Julia Kempe

cs.LG updates on arXiv.org

arXiv:2403.09869v1 Announce Type: cross
Abstract: Machine learning models often perform poorly under subpopulation shifts in the data distribution. Developing methods that allow machine learning models to better generalize to such shifts is crucial for safe deployment in real-world settings. In this paper, we develop a family of group-aware prior (GAP) distributions over neural network parameters that explicitly favor models that generalize well under subpopulation shifts. We design a simple group-aware prior that only requires access to a small set of …
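The abstract does not specify the form of the group-aware prior, but the idea of a prior that favors parameters performing well across groups can be illustrated with a toy sketch. Below, a hypothetical negative log-prior penalizes the worst-group loss of a logistic-regression model on a small group-labelled set; this is an illustrative stand-in inspired by the abstract, not the paper's actual GAP construction, and all names (`group_aware_neg_log_prior`, the data setup, the penalty `strength`) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data with two subpopulations: group 1 carries a spurious feature
# correlated with the label (hypothetical setup for illustration only).
n = 200
g = rng.integers(0, 2, size=n)                  # group labels
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(float)
X[g == 1, 1] = y[g == 1] * 2 - 1                # spurious feature in group 1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def group_aware_neg_log_prior(w, X_s, y_s, g_s, strength=5.0):
    """Illustrative 'group-aware' penalty: assign low prior probability
    (high penalty) to weights with high worst-group loss on a small
    group-labelled subset."""
    losses = []
    for grp in np.unique(g_s):
        m = g_s == grp
        p = sigmoid(X_s[m] @ w)
        losses.append(-np.mean(y_s[m] * np.log(p + 1e-9)
                               + (1 - y_s[m]) * np.log(1 - p + 1e-9)))
    return strength * max(losses)

def objective(w):
    # MAP objective: average negative log-likelihood plus the
    # group-aware negative log-prior on a small labelled subset.
    p = sigmoid(X @ w)
    nll = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    return nll + group_aware_neg_log_prior(w, X[:40], y[:40], g[:40])

# Crude coordinate search that only accepts improving steps,
# just to show the objective in use.
w = np.zeros(2)
for _ in range(200):
    for i in range(2):
        for step in (0.1, -0.1):
            trial = w.copy()
            trial[i] += step
            if objective(trial) < objective(w):
                w = trial
```

Because the penalty targets the worst group rather than the average, minimizing this objective discourages weights that rely on the spurious feature that only helps one subpopulation.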

