March 20, 2024, 4:41 a.m. | Raghavendra Selvan, Bob Pepin, Christian Igel, Gabrielle Samuel, Erik B Dam

cs.LG updates on arXiv.org

arXiv:2403.12562v1 Announce Type: new
Abstract: Recent advances in deep learning (DL) have been accelerated by access to large-scale data and compute. These large-scale resources have been used to train progressively larger models, which are resource-intensive in terms of compute, data, energy, and carbon emissions. These costs are becoming a new type of entry barrier for researchers and practitioners with limited access to resources at such scale, particularly in the Global South. In this work, we take a comprehensive …
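The abstract points to energy and carbon emissions as part of the cost of training ever-larger models. As a minimal, hedged sketch (not taken from the paper), the footprint of a training run is commonly approximated from GPU power draw, run time, data-centre overhead (PUE), and grid carbon intensity; every number below is an illustrative assumption.

```python
# Back-of-envelope estimate of training energy use and carbon emissions.
# All parameters (GPU count, power draw, hours, PUE, grid carbon intensity)
# are hypothetical placeholders, not values from the paper.

def training_footprint(num_gpus: int,
                       gpu_power_watts: float,
                       hours: float,
                       pue: float = 1.5,
                       carbon_intensity_kg_per_kwh: float = 0.4) -> tuple[float, float]:
    """Return (energy_kwh, co2_kg) for a hypothetical training run.

    pue: power usage effectiveness, i.e. data-centre overhead factor.
    carbon_intensity_kg_per_kwh: grid emissions per kWh (region dependent).
    """
    # Energy drawn by the accelerators, scaled by data-centre overhead.
    energy_kwh = num_gpus * gpu_power_watts * hours / 1000.0 * pue
    # Emissions follow from the local grid's carbon intensity.
    co2_kg = energy_kwh * carbon_intensity_kg_per_kwh
    return energy_kwh, co2_kg


if __name__ == "__main__":
    # Example: 8 GPUs drawing ~300 W each for one week of training.
    energy, co2 = training_footprint(num_gpus=8, gpu_power_watts=300, hours=168)
    print(f"Energy: {energy:.0f} kWh, CO2: {co2:.0f} kg")
```

Under these assumed numbers the run comes to roughly 600 kWh and about 240 kg of CO2; real figures vary widely with hardware, utilisation, and the regional energy mix, which is part of the equity argument the abstract raises.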

