Sept. 6, 2022, 11:45 a.m. | Aleksa Gordić - The AI Epiphany (www.youtube.com)

🚀 Find out how to get started using Weights & Biases 🚀
http://wandb.me/ai-epiphany

❤️ Become The AI Epiphany Patreon ❤️
https://www.patreon.com/theaiepiphany

In this video I cover 3 publicly shared LLM 🚀 projects/papers and the pain their teams went through while training them (🍿🍿🍿):

1. "What Language Model to Train if You Have One Million GPU Hours?" introducing BLOOM, a 176-billion-parameter model by BigScience!

2. "OPT: Open Pre-trained Transformer Language Models" introducing the 175-billion-parameter model OPT-175B!

3. "GPT-NeoX-20B: An Open-Source Autoregressive Language …

