April 15, 2024, 4:41 a.m. | Longwei Zou, Han Zhang, Yangdong Deng

cs.LG updates on arXiv.org

arXiv:2404.07999v1 Announce Type: new
Abstract: The fast-growing capabilities of large-scale deep learning models, such as BERT, GPT, and ViT, are revolutionizing the landscape of NLP, CV, and many other domains. Training such models, however, poses an unprecedented demand for computing power, which incurs exponentially increasing energy costs and carbon dioxide emissions. It is thus critical to develop efficient training solutions to reduce training costs. Motivated by a set of key observations of inter- and intra-layer similarities among feature …
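
The abstract's motivation rests on measurable similarity between feature maps across transformer layers. As a rough illustration only (not the paper's code: the toy model, layer sizes, and random input below are assumptions), this sketch computes the inter-layer cosine similarity between consecutive layers' outputs:

```python
# Illustrative sketch: inter-layer similarity of transformer feature maps.
# The architecture and input here are placeholders, not the paper's setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

d_model, n_layers = 64, 6
layers = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
    for _ in range(n_layers)
)

x = torch.randn(2, 16, d_model)  # (batch, sequence, features) toy input

# Run the stack layer by layer, keeping every intermediate feature map.
feature_maps = []
h = x
for layer in layers:
    h = layer(h)
    feature_maps.append(h)

# Inter-layer similarity: cosine similarity between the feature maps of
# consecutive layers, averaged over all tokens in the batch.
for i in range(n_layers - 1):
    sim = F.cosine_similarity(feature_maps[i], feature_maps[i + 1], dim=-1).mean()
    print(f"layers {i}->{i + 1}: mean cosine similarity {sim:.3f}")
```

High similarity between consecutive layers of this kind is the sort of redundancy the abstract cites as motivation for a more efficient training framework.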

