April 11, 2024, 4:41 a.m. | Jae-Won Chung, Mosharaf Chowdhury

cs.LG updates on arXiv.org arxiv.org

arXiv:2404.06675v1 Announce Type: new
Abstract: The enormous energy consumption of machine learning (ML) and generative AI workloads shows no sign of waning, taking a toll on operating costs, power delivery, and environmental sustainability. Despite a long line of research on energy-efficient hardware, we found that software plays a critical role in ML energy optimization through two recent works: Zeus and Perseus. This is especially true for large language models (LLMs) because their model sizes and, therefore, energy demands are growing …
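Zeus, one of the two works named in the abstract, provides programmatic GPU energy measurement for ML workloads. Below is a minimal sketch assuming the open-source zeus-ml package's ZeusMonitor interface and an NVIDIA GPU visible through NVML; it is an illustration, not the paper's own evaluation code.

# Minimal sketch: measure the energy of one training window with Zeus.
# Assumes `pip install zeus-ml` and an NVML-visible NVIDIA GPU (index 0).
from zeus.monitor import ZeusMonitor

# Monitor GPU 0; pass more indices to aggregate across multiple devices.
monitor = ZeusMonitor(gpu_indices=[0])

monitor.begin_window("training")
# ... run a training step or epoch here ...
measurement = monitor.end_window("training")

# The returned measurement holds wall-clock time (s) and per-GPU energy (J).
print(f"time   : {measurement.time:.2f} s")
print(f"energy : {sum(measurement.gpu_energy.values()):.2f} J")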
