May 10, 2023, 12:59 p.m. | Mack DeGeurin

Gizmodo (gizmodo.com)

Popular large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard are energy-intensive, requiring massive server farms to supply enough computing power to train the powerful programs. Cooling those same data centers also makes the AI chatbots incredibly thirsty. New research suggests training for GPT-3 alone consumed…


ai, ai chatbots, articles, artificial intelligence, bard, chatbots, chatgpt, computational neuroscience, computing, cooling, data, data center, data centers, energy, environment, google, gpt, gpt-3, lamda, language, language models, large language models, llms, massive, microsoft, openai, popular, reactor, research, server, study, technology, internet, training, water

Data Architect

@ University of Texas at Austin | Austin, TX

Data ETL Engineer

@ University of Texas at Austin | Austin, TX

Lead GNSS Data Scientist

@ Lurra Systems | Melbourne

Senior Machine Learning Engineer (MLOps)

@ Promaton | Remote, Europe

Developer AI Senior Staff Engineer, Machine Learning

@ Google | Sunnyvale, CA, USA; New York City, USA

Engineer* Cloud & Data Operations (f/m/d)

@ SICK Sensor Intelligence | Waldkirch (bei Freiburg), DE, 79183