Feb. 7, 2024, 10:51 p.m. | /u/louisbrulenaudet

r/MachineLearning | www.reddit.com

I have just released Pearl-3x7B, a Mixture of Experts (MoE) built from the following models:

* **dvilasuero/DistilabelBeagle14-7B**
* **beowolx/CodeNinja-1.0-OpenChat-7B**
* **WizardLM/WizardMath-7B-V1.1**

Link to Hugging Face: [https://huggingface.co/louisbrulenaudet/Pearl-3x7B](https://huggingface.co/louisbrulenaudet/Pearl-3x7B)
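
For reference, here is a minimal sketch of how the model can be loaded and queried with 🤗 Transformers. The prompt, dtype, and generation settings below are illustrative assumptions, not values taken from the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "louisbrulenaudet/Pearl-3x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust to your hardware
    device_map="auto",
)

# Build a chat-formatted prompt; this assumes the repo ships a standard chat template.
messages = [
    {"role": "user", "content": "Write a Python function that checks whether a number is prime."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.9)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```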

A Mixture of Experts (MoE) model combines several specialized models within a single framework, routing each input to the experts best suited to handle it. For a chat-oriented MoE like this one, that means merging expertise from three distinct domains - chat, code, …
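
To make the routing idea concrete, here is a toy, Mixtral-style sparse MoE feed-forward layer: a learned router scores each token against three expert MLPs and dispatches it to the top-2. This is a generic illustration with made-up dimensions, not the actual Pearl-3x7B code (frankenMoE merges like this are typically assembled with tooling such as mergekit rather than written by hand):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoEFeedForward(nn.Module):
    """Toy sparse MoE block: a router picks the top-k of 3 expert MLPs per token."""

    def __init__(self, hidden_size: int = 64, ffn_size: int = 128,
                 num_experts: int = 3, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(hidden_size, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden_size, ffn_size), nn.SiLU(),
                          nn.Linear(ffn_size, hidden_size))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size); the router scores each token against every expert.
        logits = self.router(x)
        weights, indices = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)  # normalize over the selected experts only

        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e        # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(5, 64)                      # 5 tokens with hidden size 64
print(SparseMoEFeedForward()(tokens).shape)      # torch.Size([5, 64])
```

Only the selected experts run for each token, which is why a 3x7B MoE keeps per-token compute close to a single 7B model while retaining the specialized knowledge of all three.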

