Toward Cross-Layer Energy Optimizations in Machine Learning Systems
April 11, 2024, 4:41 a.m. | Jae-Won Chung, Mosharaf Chowdhury
cs.LG updates on arXiv.org
Abstract: The enormous energy consumption of machine learning (ML) and generative AI workloads shows no sign of waning, taking a toll on operating costs, power delivery, and environmental sustainability. Despite a long line of research on energy-efficient hardware, we found that software plays a critical role in ML energy optimization through two recent works: Zeus and Perseus. This is especially true for large language models (LLMs) because their model sizes and, therefore, energy demands are growing …
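The software-level optimization the abstract describes ultimately rests on measuring energy as power integrated over time. As a minimal illustration of that arithmetic (this is a hypothetical sketch, not the Zeus or Perseus implementation; real tools read GPU power counters, e.g. via NVML):

```python
# Hypothetical sketch: energy from discrete power samples via the
# trapezoidal rule. Real measurement tools (such as the authors' Zeus)
# sample GPU power through NVML; here we integrate made-up samples to
# show the software-side computation only.

def energy_joules(samples):
    """samples: list of (timestamp_s, power_w) pairs, sorted by time."""
    total = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        # Trapezoid: average power over the interval times its duration.
        total += (p0 + p1) / 2.0 * (t1 - t0)
    return total

# A steady 300 W draw for 2 seconds corresponds to 600 J.
samples = [(0.0, 300.0), (1.0, 300.0), (2.0, 300.0)]
print(energy_joules(samples))  # -> 600.0
```

Lowering either the power term (e.g. GPU frequency scaling) or the time term (faster execution) reduces the integral, which is the trade-off space this line of work explores.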