Pruning has a disparate impact on model accuracy. (arXiv:2205.13574v3 [cs.LG] UPDATED)
Oct. 14, 2022, 1:13 a.m. | Cuong Tran, Ferdinando Fioretto, Jung-Eun Kim, Rakshit Naidu
cs.LG updates on arXiv.org
Network pruning is a widely used compression technique that can significantly
scale down overparameterized models with minimal loss of accuracy. This paper
shows that pruning may create or exacerbate disparate impacts. It sheds light
on the factors causing such disparities, suggesting that differences in
gradient norms and in distance to the decision boundary across groups are
responsible for this critical issue. The paper analyzes these factors in
detail, providing both theoretical and empirical support, and proposes a
simple, yet effective, …
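The pruning technique the abstract refers to is typically magnitude pruning: zeroing out the weights with the smallest absolute values until a target sparsity is reached. The sketch below is a minimal, self-contained illustration of that general idea in NumPy, not the paper's method; the function name and threshold rule are assumptions for illustration.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Illustrative magnitude pruning: zero out the `sparsity` fraction
    of entries with the smallest absolute value. Not the paper's method."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold     # keep only larger-magnitude weights
    return weights * mask

# Example: prune half of a small weight matrix
W = np.array([[0.5, -0.01, 0.3],
              [0.02, -0.8, 0.001]])
P = magnitude_prune(W, 0.5)
# The three smallest-magnitude entries (0.001, -0.01, 0.02) are zeroed;
# 0.5, 0.3, and -0.8 survive.
```

The paper's point is that which examples this kind of global threshold hurts is not uniform: groups whose correct classification depends on small-magnitude weights (equivalently, groups with larger gradient norms or examples closer to the decision boundary) lose disproportionately more accuracy.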