April 2, 2024, 7:41 p.m. | Ye Qiao, Haocheng Xu, Sitao Huang

cs.LG updates on arXiv.org

arXiv:2404.00271v1 Announce Type: new
Abstract: Neural architecture search (NAS) is an effective method for discovering new convolutional neural network (CNN) architectures. However, existing approaches often require time-consuming training or intensive sampling and evaluation. Zero-shot NAS aims to create training-free proxies that predict architecture performance. Yet existing proxies deliver suboptimal performance and are often outperformed by simple metrics such as model parameter counts or the number of floating-point operations. Moreover, existing model-based proxies cannot be generalized to new search spaces with …
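To make the idea of a training-free proxy concrete, here is a minimal sketch (not the paper's proposed method) comparing two common zero-shot signals the abstract alludes to: the parameter-count baseline and a simple gradient-norm proxy computed from a single random mini-batch. The two toy CNN candidates are hypothetical stand-ins for architectures sampled from a search space.

```python
# Minimal sketch of training-free ("zero-shot") proxies for ranking candidate CNNs.
# Assumption: PyTorch is available; the candidate models below are illustrative only.
import torch
import torch.nn as nn


def param_count_proxy(model: nn.Module) -> float:
    """Baseline proxy: total number of trainable parameters."""
    return float(sum(p.numel() for p in model.parameters() if p.requires_grad))


def grad_norm_proxy(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> float:
    """Training-free proxy: sum of gradient L2 norms from one forward/backward pass."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    return float(sum(p.grad.norm().item() for p in model.parameters() if p.grad is not None))


if __name__ == "__main__":
    # Two toy candidates standing in for sampled architectures.
    candidates = {
        "small": nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                               nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10)),
        "wide": nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                              nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10)),
    }
    x = torch.randn(4, 3, 32, 32)          # one random mini-batch; no training loop
    y = torch.randint(0, 10, (4,))
    for name, net in candidates.items():
        print(name, param_count_proxy(net), grad_norm_proxy(net, x, y))
```

A zero-shot NAS pipeline would score every candidate this way and keep the top-ranked ones, which is why the abstract's point matters: if such proxies rank architectures no better than raw parameter counts or FLOPs, the added complexity buys little.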

