Sept. 10, 2022, 3:39 p.m. | Aleksa Gordić - The AI Epiphany


❤️ Become The AI Epiphany Patreon ❤️
https://www.patreon.com/theaiepiphany

👨‍👩‍👧‍👦 Join our Discord community 👨‍👩‍👧‍👦
https://discord.gg/peBrCpheKE

In this video I do a deep dive into the metaseq (Open Pretrained Transformer, OPT-175B) codebase.

I first show you how to set up the code on your machine, and then I walk you through the codebase behind Meta's large language model (LLM).

Along the way I cover key concepts behind mixed precision training such as loss scaling and unscaling, and much more.
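The loss-scaling idea mentioned above can be sketched in a few lines. This is a hypothetical toy, not metaseq's implementation: small fp16 gradients can underflow to zero, so the loss is multiplied by a large scale factor before backprop and the resulting gradients are divided by the same factor before the optimizer step (frameworks like PyTorch's `torch.cuda.amp.GradScaler` automate this, including dynamic scale adjustment).

```python
# Toy illustration of loss scaling/unscaling in mixed precision training.
# The "backward" function is a hypothetical stand-in for autograd on a
# linear loss, chosen only so the scaling arithmetic is easy to follow.

SCALE = 2.0 ** 16  # a typical initial loss scale

def backward(loss):
    # pretend the gradient of our toy loss is loss / 1000
    return loss / 1000.0

true_loss = 4.0
true_grad = backward(true_loss)        # tiny gradient that could underflow in fp16

scaled_loss = true_loss * SCALE        # 1) scale the loss before backward
scaled_grad = backward(scaled_loss)    # 2) gradients come out scaled too
unscaled_grad = scaled_grad / SCALE    # 3) unscale before the optimizer step

# the unscaled gradient matches the gradient we would have gotten without scaling
assert abs(unscaled_grad - true_grad) < 1e-12
```

Because backprop is linear in the loss, scaling the loss scales every gradient by the same factor, which is why a single division recovers the true gradients.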

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ …

Tags: coding, opt-175b, series, transformer
