Jan. 14, 2024, 12:17 p.m. | Tanya Malhotra

MarkTechPost www.marktechpost.com

In recent research, a team from Mistral AI has presented Mixtral 8x7B, a language model built on a Sparse Mixture of Experts (SMoE) architecture and released with open weights. Licensed under Apache 2.0, Mixtral is a decoder-only model whose layers route tokens through a sparse mixture-of-experts network. The team […]
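The excerpt is truncated here, but to illustrate what "sparse mixture of experts" refers to, below is a minimal sketch of a top-k-routed MoE feed-forward layer of the general kind Mixtral 8x7B is reported to use (8 experts, 2 active per token). This is an illustrative assumption-laden sketch, not Mistral's actual implementation; all class and parameter names are made up for the example.

```python
# Illustrative sketch of a sparse Mixture-of-Experts feed-forward layer
# with top-2 routing (8 experts, 2 active per token, as reported for
# Mixtral 8x7B). Not Mistral's code; names and sizes are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, dim: int, hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(dim, n_experts, bias=False)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
             for _ in range(n_experts)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Select the top-k experts per token, weight their
        # outputs by the softmax of the router scores, and sum them.
        scores = self.router(x)                        # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.top_k)  # (tokens, top_k)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = SparseMoE(dim=64, hidden=256)
    tokens = torch.randn(10, 64)
    print(layer(tokens).shape)  # torch.Size([10, 64])
```

Because only the selected experts run for each token, such a layer keeps per-token compute close to that of a much smaller dense model while the total parameter count spans all experts.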


The post Mistral AI Introduces Mixtral 8x7B: a Sparse Mixture of Experts (SMoE) Language Model Transforming Machine Learning appeared first on MarkTechPost.

Tags: ai shorts, apache, apache 2.0, applications, artificial intelligence, editors pick, experts, language, language model, large language model, license, machine, machine learning, mistral, mistral ai, mixtral, mixtral 8x7b, mixture of experts, network, research, researchers, staff, team, tech news, technology
