Surprisingly Strong Performance Prediction with Neural Graph Features
April 26, 2024, 4:41 a.m. | Gabriela Kadlecová, Jovita Lukasik, Martin Pilát, Petra Vidnerová, Mahmoud Safari, Roman Neruda, Frank Hutter
cs.LG updates on arXiv.org arxiv.org
Abstract: Performance prediction has been a key part of the neural architecture search (NAS) process, allowing NAS algorithms to be sped up by avoiding resource-consuming network training. Although many performance predictors correlate well with ground-truth performance, they require training data in the form of trained networks. Recently, zero-cost proxies have been proposed as an efficient method to estimate network performance without any training. However, they are still poorly understood, exhibit biases with network properties, and their …
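The snippet does not give the paper's exact feature set, but the idea of training-free graph features can be illustrated with a minimal sketch: extract simple structural properties (operation counts, degree statistics, depth) from a NAS cell represented as a DAG. The toy cell, feature choices, and function names below are hypothetical, not the paper's encoding.

```python
from collections import Counter

def graph_features(edges, ops, op_vocab):
    """Extract simple, training-free graph features from an architecture DAG.

    edges: list of (src, dst) node pairs
    ops:   dict mapping node id -> operation name
    op_vocab: operation types to count as features
    """
    nodes = sorted(ops)
    counts = Counter(ops.values())
    # Feature group 1: count of each operation type in the cell.
    op_counts = [counts.get(op, 0) for op in op_vocab]
    # Feature group 2: maximum in/out degree (how branched the cell is).
    in_deg = Counter(dst for _, dst in edges)
    out_deg = Counter(src for src, _ in edges)
    max_in = max((in_deg.get(n, 0) for n in nodes), default=0)
    max_out = max((out_deg.get(n, 0) for n in nodes), default=0)
    # Feature group 3: longest path length (effective depth), via memoized DFS.
    succ = {n: [] for n in nodes}
    for s, d in edges:
        succ[s].append(d)
    memo = {}
    def depth(n):
        if n not in memo:
            memo[n] = 1 + max((depth(m) for m in succ[n]), default=0)
        return memo[n]
    longest = max((depth(n) for n in nodes), default=0)
    return op_counts + [max_in, max_out, longest]

# Toy cell: input -> conv -> pool -> output, plus a skip connection.
ops = {0: "input", 1: "conv3x3", 2: "maxpool", 3: "output"}
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
print(graph_features(edges, ops, ["conv3x3", "maxpool"]))  # → [1, 1, 2, 2, 4]
```

Features like these cost only a graph traversal per architecture, so a lightweight regressor fit on them avoids the network training that conventional performance predictors require.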