Large Model Training and Inference with DeepSpeed // Samyam Rajbhandari // LLMs in Prod Conference
June 29, 2023, 11:53 a.m. | MLOps.community
MLOps.community www.youtube.com
In the last few years, DeepSpeed has released numerous technologies for training and inference of large models, transforming the large-model training landscape from a systems perspective. Technologies like ZeRO and 3D parallelism have become the building blocks for training large models at scale, powering LLMs like Bloom-176B, Megatron-Turing 530B, and many others. Heterogeneous-memory training systems like ZeRO-Offload and ZeRO-Infinity have democratized LLMs by making them accessible with limited resources. DeepSpeed-Inference and DeepSpeed-MII have made it easy to apply …
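The ZeRO and offload features mentioned above are enabled through a JSON-style config passed to `deepspeed.initialize`. A minimal sketch of such a config, assuming ZeRO stage 3 with CPU offload (field names follow DeepSpeed's documented config schema; the batch size and model are illustrative placeholders, not tuned values):

```python
# Sketch of a DeepSpeed config enabling ZeRO stage 3 with CPU offload,
# the kind of setup that ZeRO-Offload / ZeRO-Infinity build on.
# Values here are illustrative placeholders, not recommendations.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {
        # Stage 3 partitions optimizer state, gradients, AND parameters
        # across data-parallel ranks.
        "stage": 3,
        # Push optimizer state and parameters to host RAM so large models
        # fit on limited GPU memory.
        "offload_optimizer": {"device": "cpu"},
        "offload_param": {"device": "cpu"},
    },
}

# In an actual run (requires the `deepspeed` package and GPUs):
# import deepspeed
# model_engine, optimizer, _, _ = deepspeed.initialize(
#     model=model, model_parameters=model.parameters(), config=ds_config)
```

The same config dict drives both single-node and multi-node launches via the `deepspeed` launcher, which is part of what makes these techniques easy to adopt.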