The Power of Training: How Different Neural Network Setups Influence the Energy Demand
March 19, 2024, 4:45 a.m. | Daniel Geißler, Bo Zhou, Mengxi Liu, Sungho Suh, Paul Lukowicz
cs.LG updates on arXiv.org arxiv.org
Abstract: This work offers a heuristic evaluation of how variations in machine learning training regimes and learning paradigms affect the energy consumption of computing, especially on HPC hardware, from a life-cycle-aware perspective. While increasing data availability and innovation in high-performance hardware fuel the training of sophisticated models, they also let energy consumption and carbon emissions fade from view. Therefore, the goal of this work is to raise awareness about the energy impact of …
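To make the kind of accounting the abstract gestures at concrete, here is a minimal back-of-the-envelope sketch: given an assumed average GPU power draw and training duration (both hypothetical values, not figures from the paper), it converts them into energy in kWh and a carbon estimate using an assumed grid emission factor.

```python
def training_energy_kwh(avg_power_watts: float, hours: float) -> float:
    """Energy consumed by a training run: power (W) x time (h) / 1000 -> kWh."""
    return avg_power_watts * hours / 1000.0


def carbon_kg(energy_kwh: float, grid_factor_kg_per_kwh: float = 0.4) -> float:
    """CO2 estimate from energy use; 0.4 kg/kWh is an assumed grid average."""
    return energy_kwh * grid_factor_kg_per_kwh


# Hypothetical example: one GPU averaging 300 W over a 100-hour training run.
energy = training_energy_kwh(300, 100)   # 30.0 kWh
emissions = carbon_kg(energy)            # 12.0 kg CO2
print(f"{energy:.1f} kWh, ~{emissions:.1f} kg CO2")
```

Real measurements would sample power over time (e.g. via NVIDIA's NVML interface) rather than assume a flat average, and a life-cycle view would add embodied hardware emissions on top of this operational estimate.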