July 21, 2023, 10 a.m. | Editorial Team

insideBIGDATA (insidebigdata.com)

In this video presentation, Aleksa Gordić explains what it takes to scale ML models up to trillions of parameters. He covers the fundamental ideas behind recent large ML models such as Meta's OPT-175B, BigScience's BLOOM-176B, EleutherAI's GPT-NeoX-20B and GPT-J, OpenAI's GPT-3, Google's PaLM, and DeepMind's Chinchilla and Gopher models.
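To see why scaling to these sizes demands techniques like mixed precision and memory sharding (the kind of ideas the talk covers), a back-of-the-envelope memory calculation helps. The sketch below is illustrative, not from the video: it assumes the commonly cited figure of roughly 16 bytes of training state per parameter for mixed-precision Adam (fp16 weights and gradients plus fp32 master weights and two fp32 optimizer moments); the function names are our own.

```python
# Rough memory math for training large models with mixed-precision Adam.
# Assumed breakdown (~16 bytes/param): fp16 weights (2) + fp16 grads (2)
# + fp32 master weights (4) + fp32 Adam momentum (4) + fp32 Adam variance (4).

def training_memory_gb(n_params: float, bytes_per_param: float = 16.0) -> float:
    """Approximate training-state memory in GB for n_params parameters."""
    return n_params * bytes_per_param / 1e9

def per_gpu_memory_gb(n_params: float, n_gpus: int) -> float:
    """Per-device share if all training state is sharded evenly
    across n_gpus devices (in the spirit of ZeRO-style sharding)."""
    return training_memory_gb(n_params) / n_gpus

# A 175B-parameter model (OPT-175B / GPT-3 scale):
total = training_memory_gb(175e9)          # ~2800 GB of training state
shard = per_gpu_memory_gb(175e9, 128)      # ~21.9 GB per GPU across 128 GPUs
print(f"total: {total:.0f} GB, per GPU over 128 devices: {shard:.1f} GB")
```

The point of the arithmetic: even ignoring activations, a 175B-parameter model needs terabytes of optimizer and weight state, far beyond a single accelerator's memory, which is why sharded optimizers and model parallelism are essential at this scale.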

