EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search
March 25, 2024, 4:42 a.m. | Pedram Bakhtiarifard, Christian Igel, Raghavendra Selvan
cs.LG updates on arXiv.org arxiv.org
Abstract: Energy consumption from the selection, training, and deployment of deep learning models has seen a significant uptick recently. This work aims to facilitate the design of energy-efficient deep learning models that require fewer computational resources and prioritize environmental sustainability by focusing on energy consumption. Neural architecture search (NAS) benefits from tabular benchmarks, which evaluate NAS strategies cost-effectively through precomputed performance statistics. We advocate for including energy efficiency as an additional performance criterion in NAS. …
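To illustrate the idea of an energy-aware tabular benchmark, the sketch below queries a small precomputed table of architectures and returns the ones that are Pareto-optimal when accuracy is maximized and energy consumption is minimized. The table contents, architecture names, and the `pareto_front` helper are hypothetical illustrations, not the EC-NAS API or its data.

```python
# Hypothetical sketch of querying a tabular NAS benchmark with energy metrics.
# Architecture ids and numbers are illustrative, not from EC-NAS itself.

# Precomputed table: architecture id -> (validation accuracy, training energy in kWh)
BENCHMARK = {
    "arch_a": (0.92, 1.8),
    "arch_b": (0.94, 3.5),
    "arch_c": (0.91, 0.9),
    "arch_d": (0.94, 2.7),
}

def pareto_front(table):
    """Return architectures not dominated on (higher accuracy, lower energy)."""
    front = []
    for name, (acc, kwh) in table.items():
        dominated = any(
            (acc2 >= acc and kwh2 <= kwh) and (acc2 > acc or kwh2 < kwh)
            for other, (acc2, kwh2) in table.items()
            if other != name
        )
        if not dominated:
            front.append(name)
    return sorted(front)

print(pareto_front(BENCHMARK))  # arch_b is dominated by arch_d (same accuracy, less energy)
```

Because every entry is precomputed, a NAS strategy can explore such trade-offs without training any model, which is exactly what makes tabular benchmarks cheap to evaluate against.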