July 21, 2023, 10 a.m. | Editorial Team

insideBIGDATA (insidebigdata.com)

In this video presentation, Aleksa Gordić explains what it takes to scale ML models up to trillions of parameters. He covers the fundamental ideas behind recent large ML models such as Meta's OPT-175B, BigScience's BLOOM-176B, EleutherAI's GPT-NeoX-20B and GPT-J, OpenAI's GPT-3, Google's PaLM, and DeepMind's Chinchilla and Gopher.
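One of the fundamental ideas behind scaling training to models of this size is data parallelism: each worker computes gradients on its own shard of the batch, and the gradients are averaged across workers so every replica applies the same update. The toy sketch below (not from the talk; the model and data are made up for illustration) shows why averaging equal-sized shard gradients reproduces the full-batch gradient:

```python
# Toy illustration of data parallelism: each "worker" computes the
# gradient on its own shard of the batch, and the per-worker gradients
# are averaged (an all-reduce in a real system) so every worker ends up
# with the same update as single-device, full-batch training.

def grad(w, shard):
    """Gradient of mean squared error for a 1-parameter linear model y ~ w*x."""
    return sum(2 * x * (w * x - y) for x, y in shard) / len(shard)

def data_parallel_grad(w, batch, num_workers):
    """Split the batch into equal shards and average the per-worker gradients."""
    size = len(batch) // num_workers
    shards = [batch[i * size:(i + 1) * size] for i in range(num_workers)]
    return sum(grad(w, s) for s in shards) / num_workers

batch = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 9.0)]
w = 0.5
full = grad(w, batch)                                   # full-batch gradient
parallel = data_parallel_grad(w, batch, num_workers=2)  # averaged shard gradients
print(abs(full - parallel) < 1e-12)                     # prints True
```

Real systems such as DeepSpeed and Megatron-LM combine this with model, pipeline, and tensor parallelism plus mixed-precision arithmetic to reach the scales discussed in the talk.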

