March 20, 2024, 4:41 a.m. | Raghavendra Selvan, Bob Pepin, Christian Igel, Gabrielle Samuel, Erik B Dam

cs.LG updates on arXiv.org

arXiv:2403.12562v1 Announce Type: new
Abstract: The recent advances in deep learning (DL) have been accelerated by access to large-scale data and compute. These large-scale resources have been used to train progressively larger models, which are resource-intensive in terms of compute, data, energy, and carbon emissions. These costs are becoming a new type of entry barrier for researchers and practitioners with limited access to resources at such a scale, particularly in the Global South. In this work, we take a comprehensive …

