April 2, 2024, 7:52 p.m. | Vivian Liu, Yiqiao Yin

cs.CL updates on arXiv.org

arXiv:2404.01157v1 Announce Type: new
Abstract: Prominent works in Natural Language Processing have long sought to build innovative models by improving on previous training approaches, altering model architectures, and developing richer datasets to boost performance. However, as the field of NLP advances rapidly, greenhouse gas emissions rise with it, raising concerns about the environmental damage caused by training LLMs. Gaining a comprehensive understanding of the various costs, particularly those pertaining to environmental aspects, that are …
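As context for the kind of environmental cost the abstract refers to, here is a minimal sketch (not taken from the paper) of how training-time carbon emissions are commonly estimated: accelerator energy use, scaled by datacenter overhead (PUE) and the local grid's carbon intensity. The function name and all numbers are illustrative placeholders, not figures reported by the authors.

```python
def estimate_training_co2(
    num_gpus: int,
    avg_gpu_power_kw: float,            # average draw per GPU in kilowatts (assumed)
    training_hours: float,
    pue: float = 1.5,                   # datacenter power usage effectiveness (assumed)
    grid_kg_co2_per_kwh: float = 0.4,   # grid carbon intensity (assumed)
) -> float:
    """Return estimated training emissions in kilograms of CO2-equivalent."""
    energy_kwh = num_gpus * avg_gpu_power_kw * training_hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Example: 64 GPUs drawing ~0.3 kW each over two weeks of training.
print(f"{estimate_training_co2(64, 0.3, 14 * 24):,.0f} kg CO2e")
```

Under these hypothetical inputs the run would emit roughly 3,900 kg CO2e; the paper itself examines such costs in far more depth than this back-of-the-envelope estimate.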
