Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model. (arXiv:2211.02001v1 [cs.LG])
Nov. 4, 2022, 1:12 a.m. | Alexandra Sasha Luccioni, Sylvain Viguier, Anne-Laure Ligozat
cs.LG updates on arXiv.org
Progress in machine learning (ML) comes with a cost to the environment, given
that training ML models requires significant computational resources, energy
and materials. In the present article, we aim to quantify the carbon footprint
of BLOOM, a 176-billion parameter language model, across its life cycle. We
estimate that BLOOM's final training emitted approximately 24.7 tonnes
of CO2eq if we consider only the dynamic power consumption, and 50.5 tonnes
if we account for all processes ranging from equipment manufacturing to
energy-based operational …
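The dynamic-power part of such an estimate typically follows a simple chain: total GPU-hours times average power draw gives energy, scaled by the data centre's power usage effectiveness (PUE), then multiplied by the grid's carbon intensity. A minimal sketch of that arithmetic, using purely illustrative values (not the figures or methodology from the paper):

```python
# Illustrative back-of-the-envelope estimate of dynamic-power emissions.
# All values below are hypothetical placeholders, not the paper's data.
gpu_hours = 1_000_000        # hypothetical total GPU-hours of training
avg_power_kw = 0.4           # hypothetical average draw per GPU, in kW
pue = 1.2                    # hypothetical power usage effectiveness of the data centre
carbon_intensity = 0.057     # hypothetical grid intensity, kg CO2eq per kWh

# Energy consumed, accounting for data-centre overhead via PUE
energy_kwh = gpu_hours * avg_power_kw * pue

# Convert kg to tonnes of CO2eq
emissions_tonnes = energy_kwh * carbon_intensity / 1000
print(round(emissions_tonnes, 1))  # → 27.4
```

A full life-cycle estimate, as the abstract notes, would add terms beyond this, such as embodied emissions from equipment manufacturing.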