Feb. 2, 2024, 12:41 p.m. | Frederik Bussler

Towards AI - Medium (pub.towardsai.net)

Photo by Agence Olloweb on Unsplash

AI models have grown massively in size in recent years, with models like GPT-3 containing 175 billion parameters (and an estimated 1 trillion or more in GPT-4).

However, these colossal models come with downsides — they require substantial computing power, have high operating costs, and can perpetuate harmful biases if not carefully monitored.
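To make the cost point concrete, here is a back-of-envelope sketch in Python (not from the article) estimating the memory needed just to hold model weights at half precision. The 2-bytes-per-parameter figure and the omission of activations and KV-cache memory are simplifying assumptions.

# Rough memory math: parameter count -> weight storage at inference time.
# Assumes 2 bytes per parameter (fp16/bf16); real deployments also need
# memory for activations and the KV cache, so treat this as a floor.

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory, in GB, needed just to store the weights."""
    return num_params * bytes_per_param / 1e9

for name, params in [("GPT-3 (175B)", 175e9), ("1.8B model", 1.8e9)]:
    print(f"{name}: ~{weight_memory_gb(params):,.0f} GB of weights")

Under these assumptions, GPT-3's weights alone take roughly 350 GB (several high-end GPUs), while a 1.8 billion parameter model fits in about 4 GB, which is a large part of why smaller models are so much cheaper to serve.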

In response, there has been a renewed interest in smaller AI models. Weighing in at just 1.8 billion parameters, the newly released …
