Surprisingly Strong Performance Prediction with Neural Graph Features
April 26, 2024, 4:41 a.m. | Gabriela Kadlecová, Jovita Lukasik, Martin Pilát, Petra Vidnerová, Mahmoud Safari, Roman Neruda, Frank Hutter
cs.LG updates on arXiv.org
Abstract: Performance prediction has been a key part of the neural architecture search (NAS) process, as it speeds up NAS algorithms by avoiding resource-intensive network training. Although many performance predictors correlate well with ground-truth performance, they require training data in the form of trained networks. Recently, zero-cost proxies have been proposed as an efficient method for estimating network performance without any training. However, they remain poorly understood, exhibit biases tied to network properties, and their …
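To make the idea of neural graph features concrete, here is a minimal sketch (not the paper's implementation) of extracting cheap structural features from a NAS cell represented as a DAG of labeled operations. The feature names, operation labels, and example cell are illustrative assumptions.

```python
# Sketch: zero-cost structural features of a NAS cell DAG.
# Nodes are operations; edges are data flow from node 0 (input)
# to node n-1 (output). All names here are illustrative.

def graph_features(ops, edges):
    """Return a dict of cheap structural features for a cell DAG.

    ops:   list of operation labels, one per node (node i = ops[i])
    edges: list of (src, dst) pairs with src < dst (topological order)
    """
    n = len(ops)
    in_deg = [0] * n
    out_deg = [0] * n
    adj = {i: [] for i in range(n)}
    for s, d in edges:
        out_deg[s] += 1
        in_deg[d] += 1
        adj[s].append(d)

    # Count directed input->output paths by dynamic programming
    # over the topological order 0..n-1.
    paths = [0] * n
    paths[0] = 1
    for s in range(n):
        for d in adj[s]:
            paths[d] += paths[s]

    feats = {
        "num_paths": paths[n - 1],
        "max_in_degree": max(in_deg),
        "max_out_degree": max(out_deg),
    }
    # One count feature per distinct operation type.
    for op in set(ops):
        feats[f"count_{op}"] = ops.count(op)
    return feats

# Example: 4-node cell with a conv branch and a skip branch.
cell_ops = ["input", "conv3x3", "skip", "output"]
cell_edges = [(0, 1), (1, 3), (0, 2), (2, 3)]
print(graph_features(cell_ops, cell_edges))
# → {'num_paths': 2, 'max_in_degree': 2, 'max_out_degree': 2, ...}
```

Features like these cost only microseconds per architecture, which is the appeal of the zero-cost setting the abstract describes: a simple predictor can be fit on them instead of on trained networks.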