Random Search as a Baseline for Sparse Neural Network Architecture Search
March 14, 2024, 4:41 a.m. | Rezsa Farahani
cs.LG updates on arXiv.org
Abstract: Sparse neural networks have shown similar or better generalization performance than their dense counterparts while having higher parameter efficiency. This has motivated a number of works to learn, induce, or search for high-performing sparse networks. While reports of quality or efficiency gains are impressive, standard baselines are lacking, which hinders reliable comparability and reproducibility across methods. In this work, we provide an evaluation approach and a naive Random Search baseline method for finding …
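The abstract is cut off before the method details, but a naive random-search baseline for sparse architecture search can be pictured as repeatedly sampling binary weight masks at a fixed sparsity level and keeping the best-scoring one. Below is a minimal, self-contained sketch of that idea; the single dense layer W, the magnitude-based proxy objective, and the SPARSITY and N_TRIALS settings are all illustrative assumptions, not the paper's actual setup, which would presumably train each masked network and compare validation accuracy.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical dense layer whose weights we want to sparsify.
    W = rng.normal(size=(64, 64))
    SPARSITY = 0.9   # fraction of weights to zero out (illustrative)
    N_TRIALS = 200   # random-search budget (illustrative)

    def random_mask(shape, sparsity, rng):
        """Sample a binary mask keeping a (1 - sparsity) fraction of entries."""
        n = int(np.prod(shape))
        keep = int(round((1.0 - sparsity) * n))
        flat = np.zeros(n, dtype=bool)
        flat[rng.choice(n, size=keep, replace=False)] = True
        return flat.reshape(shape)

    def score(mask):
        """Toy proxy objective: total weight magnitude retained by the mask.
        A real baseline would train the masked network and report
        validation accuracy instead."""
        return np.abs(W[mask]).sum()

    # Random search: sample masks independently, keep the best one seen.
    best_mask, best_score = None, -np.inf
    for _ in range(N_TRIALS):
        mask = random_mask(W.shape, SPARSITY, rng)
        s = score(mask)
        if s > best_score:
            best_mask, best_score = mask, s

    print(f"best proxy score over {N_TRIALS} random masks: {best_score:.2f}")

Because every trial is independent, such a baseline is trivially parallelizable, which is part of what makes random search an honest point of comparison for more elaborate search methods.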