Can pruning improve certified robustness of neural networks? (arXiv:2206.07311v1 [cs.LG])
Web: http://arxiv.org/abs/2206.07311
June 16, 2022, 1:10 a.m. | Zhangheng Li, Tianlong Chen, Linyi Li, Bo Li, Zhangyang Wang
cs.LG updates on arXiv.org
With the rapid development of deep learning, neural networks have grown so large that training and inference often overwhelm hardware resources. Since neural networks are typically over-parameterized, one effective way to reduce this computational overhead is neural network pruning: removing redundant parameters from trained networks. It has recently been observed that pruning can not only reduce computational overhead but also improve the empirical robustness of deep neural networks (NNs), …
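For readers unfamiliar with the pruning step the abstract refers to, below is a minimal sketch of magnitude-based (L1) unstructured pruning using PyTorch's built-in pruning utilities. The toy two-layer MLP and the 50% sparsity level are illustrative assumptions only; the excerpt does not specify which pruning scheme or sparsity the paper actually studies.

```python
# Minimal sketch: magnitude-based unstructured pruning of a trained network.
# The tiny MLP and 50% sparsity are assumptions for illustration, not details
# taken from the paper.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy "trained" network standing in for a larger model.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 50% of weights with the smallest absolute value in each layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the pruning permanent

# Report the resulting sparsity over the pruned layers.
total, zeros = 0, 0
for module in model.modules():
    if isinstance(module, nn.Linear):
        total += module.weight.numel()
        zeros += (module.weight == 0).sum().item()
print(f"global sparsity: {zeros / total:.1%}")
```

In practice the pruned model is usually fine-tuned afterwards to recover accuracy; how pruning interacts with certified (rather than empirical) robustness is the question the paper investigates.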