Model Merging and Mixtures of Experts // Maxime Labonne // AI in Production Conference
March 16, 2024, 8:32 p.m. | MLOps.community | www.youtube.com
Model merging has recently become extremely popular in the open-source community. Merging several fine-tuned models, or combining them into a Mixture of Experts (MoE), has produced new state-of-the-art LLMs. This talk introduces the main concepts behind model merging and shows how to implement it with the mergekit library, including a notebook for creating your own models and uploading them directly to the Hugging Face Hub.
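As a rough illustration of what the talk covers, a mergekit merge is driven by a YAML config. The sketch below shows a SLERP-style merge of two fine-tuned 7B models, in the general shape mergekit configs take; the model names are placeholders, and the exact layer ranges and interpolation weights are illustrative assumptions, not values from the talk.

```yaml
# Illustrative mergekit config (SLERP merge of two hypothetical fine-tunes).
slices:
  - sources:
      - model: org-a/finetune-7b        # placeholder model name
        layer_range: [0, 32]
      - model: org-b/finetune-7b        # placeholder model name
        layer_range: [0, 32]
merge_method: slerp
base_model: org-a/finetune-7b
parameters:
  t:
    # Interpolation factor per layer group; 0 favors the base model,
    # 1 favors the other model. Values here are illustrative.
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

With a config like this saved as `config.yaml`, the merge is typically run from mergekit's command-line entry point (e.g. `mergekit-yaml config.yaml ./merged-model`), after which the resulting model directory can be pushed to the Hugging Face Hub.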
// Bio
Maxime Labonne is a seasoned Machine Learning Scientist …
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Software Engineering Manager, Generative AI - Characters
@ Meta | Bellevue, WA | Menlo Park, CA | Seattle, WA | New York City | San Francisco, CA
Senior Operations Research Analyst / Predictive Modeler
@ LinQuest | Colorado Springs, Colorado, United States