Feb. 7, 2024, 10:51 p.m. | /u/louisbrulenaudet

r/MachineLearning | www.reddit.com

I have just released Pearl-3x7B, a Mixture of Experts (MoE) built from the following models:

* **dvilasuero/DistilabelBeagle14-7B**
* **beowolx/CodeNinja-1.0-OpenChat-7B**
* **WizardLM/WizardMath-7B-V1.1**

Link to Hugging Face: [https://huggingface.co/louisbrulenaudet/Pearl-3x7B](https://huggingface.co/louisbrulenaudet/Pearl-3x7B)
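
Below is a minimal sketch of loading the model from the Hub with the `transformers` library. It assumes the repository ships a chat template and that you have a GPU with enough memory for a ~3x7B MoE in float16; the prompt, dtype, and generation settings are illustrative and may need adjusting.

```python
# Minimal sketch: loading Pearl-3x7B from the Hugging Face Hub with transformers.
# Assumes a chat template in the repo and enough GPU memory for float16 weights;
# adjust torch_dtype / device_map for your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "louisbrulenaudet/Pearl-3x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Example prompt: the exact format depends on the model's chat template.
messages = [
    {"role": "user", "content": "Write a Python function that computes a moving average."}
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```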

A Mixture of Experts (MoE) model is an architecture that combines the capabilities of multiple specialized models to handle a wide range of tasks within a unified framework. For a MoE model tailored to a chat application, integrating expertise across three distinct domains - chat, code, …
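
To make the routing idea concrete, here is a toy sketch of a sparse MoE layer in PyTorch: a gating network scores each token, the top-k experts are selected, and their outputs are mixed by the renormalized gate weights. This is an illustrative simplification of the general technique, not the actual routing code inside Pearl-3x7B; the layer sizes and `top_k` value are arbitrary.

```python
# Toy sparse MoE layer: a gate picks the top-k experts per token and mixes
# their outputs by the renormalized gate probabilities. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 3, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # router: one score per expert
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, dim)
        scores = F.softmax(self.gate(x), dim=-1)          # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)    # keep the k best experts
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                     # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

layer = ToyMoELayer(dim=64, num_experts=3, top_k=2)
print(layer(torch.randn(5, 64)).shape)  # torch.Size([5, 64])
```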

