Feb. 2, 2024, 12:41 p.m. | Frederik Bussler

Towards AI - Medium pub.towardsai.net


AI models have grown massively in size in recent years, with models like GPT-3 containing 175 billion parameters (and an estimated 1 trillion in GPT-4).

However, these colossal models come with downsides — they require substantial computing power, have high operating costs, and can perpetuate harmful biases if not carefully monitored.

In response, there has been a renewed interest in smaller AI models. Weighing in at just 1.8 billion parameters, the newly released …

