March 25, 2024, 4:42 a.m. | Pedram Bakhtiarifard, Christian Igel, Raghavendra Selvan

cs.LG updates on arXiv.org

arXiv:2210.06015v4 Announce Type: replace
Abstract: Energy consumption from the selection, training, and deployment of deep learning models has risen significantly in recent years. This work aims to facilitate the design of energy-efficient deep learning models that require fewer computational resources and prioritize environmental sustainability by focusing on energy consumption. Neural architecture search (NAS) benefits from tabular benchmarks, which evaluate NAS strategies cost-effectively through precomputed performance statistics. We advocate for including energy efficiency as an additional performance criterion in NAS. …
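Concretely, a tabular benchmark reduces architecture evaluation to a table lookup over precomputed statistics, and adding energy as a criterion turns selection into a multi-objective problem. Below is a minimal Python sketch under assumed conventions: the table, architecture names, accuracies, and kWh figures are hypothetical illustrations, not the paper's actual benchmark or API. It selects the Pareto-optimal architectures trading off validation accuracy against training energy.

# A minimal sketch of multi-objective lookup in a tabular NAS benchmark.
# The table below is hypothetical; real tabular benchmarks precompute such
# statistics by training every architecture in a fixed search space.

from typing import Dict, List, Tuple

# Hypothetical precomputed table: architecture id -> (val. accuracy, training energy in kWh)
BENCHMARK: Dict[str, Tuple[float, float]] = {
    "arch-a": (0.94, 1.8),
    "arch-b": (0.95, 3.1),
    "arch-c": (0.91, 0.9),
    "arch-d": (0.95, 2.6),
    "arch-e": (0.89, 0.7),
}

def pareto_front(table: Dict[str, Tuple[float, float]]) -> List[str]:
    """Return architectures not dominated in (accuracy up, energy down)."""
    front = []
    for name, (acc, energy) in table.items():
        dominated = any(
            other_acc >= acc and other_energy <= energy
            and (other_acc > acc or other_energy < energy)
            for other, (other_acc, other_energy) in table.items()
            if other != name
        )
        if not dominated:
            front.append(name)
    return front

if __name__ == "__main__":
    # Each evaluation costs only a table read, which is what makes tabular
    # benchmarks cheap for comparing NAS strategies.
    for name in pareto_front(BENCHMARK):
        acc, energy = BENCHMARK[name]
        print(f"{name}: accuracy={acc:.2f}, energy={energy:.1f} kWh")

Returning the Pareto front rather than a single scalarized score preserves the accuracy-energy trade-off, leaving it to the NAS strategy (or the practitioner) to pick an operating point.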
