H2O.ai’s Danube and The Case for Smaller, More Accessible AI Models
Towards AI - Medium pub.towardsai.net
AI models have grown massively in size in recent years, with GPT-3 containing 175 billion parameters and GPT-4 estimated at over 1 trillion.
However, these colossal models come with downsides — they require substantial computing power, have high operating costs, and can perpetuate harmful biases if not carefully monitored.
In response, there has been a renewed interest in smaller AI models. Weighing in at just 1.8 billion parameters, the newly released …