Importance Tempering: Group Robustness for Overparameterized Models. (arXiv:2209.08745v2 [cs.LG] UPDATED)
Sept. 29, 2022, 1:13 a.m. | Yiping Lu, Wenlong Ji, Zachary Izzo, Lexing Ying
stat.ML updates on arXiv.org arxiv.org
Although overparameterized models have succeeded on many machine learning
tasks, their accuracy can drop when the test distribution differs from the
training distribution. This accuracy drop still limits applying machine
learning in the wild. At the same time, importance weighting, a traditional
technique for handling distribution shift, has been shown both empirically and
theoretically to have little or no effect on overparameterized models. In this
paper, we propose importance tempering to improve the decision boundary …
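For context on the baseline the abstract refers to: classical importance weighting scales each training example's loss by a weight w(x) ≈ p_test(x) / p_train(x), so that training on the source distribution mimics the target one. A minimal sketch of an importance-weighted binary cross-entropy loss (the traditional technique the authors say fails for overparameterized models, not the paper's proposed importance tempering, whose details are truncated here):

```python
import numpy as np

def importance_weighted_loss(logits, labels, weights):
    """Weighted binary cross-entropy.

    Each example's loss is scaled by an importance weight,
    conventionally w(x) = p_test(x) / p_train(x), so the weighted
    training loss estimates the test-distribution risk.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid
    per_example = -(labels * np.log(probs + 1e-12)
                    + (1.0 - labels) * np.log(1.0 - probs + 1e-12))
    # Normalize by the total weight so the scale of the loss is
    # comparable across different weight vectors.
    return np.sum(weights * per_example) / np.sum(weights)
```

With uniform weights this reduces to the ordinary mean loss; upweighting examples that the model misclassifies pulls the optimum toward them, which is exactly the effect that (per the abstract) vanishes in the interpolating, overparameterized regime.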