April 18, 2024, 2:39 p.m. | Ryan Daws

AI News www.artificialintelligence-news.com

Mistral AI has released Mixtral 8x22B, which sets a new benchmark for open source models in performance and efficiency. The model boasts robust multilingual capabilities and superior mathematical and coding prowess. Mixtral 8x22B operates as a Sparse Mixture-of-Experts (SMoE) model, activating just 39 billion of its 141 billion parameters for any given token. Beyond its efficiency, the...
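The announcement does not include Mistral's code, but the general SMoE idea behind the 39-billion-of-141-billion figure can be illustrated with a toy sketch: a learned router scores a set of expert feed-forward networks per token and only the top-k experts run, so most parameters stay idle on each forward pass. The PyTorch snippet below is purely illustrative; the layer sizes, expert count, and top-2 routing are placeholder assumptions, not Mixtral's actual configuration.

```python
# Illustrative sketch only: a toy Sparse Mixture-of-Experts layer with top-2
# routing. This is NOT Mistral's implementation; dimensions and expert count
# are arbitrary placeholders chosen for readability.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToySparseMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=128, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One router (gate) scores every expert for each token.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                          # (batch, seq, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep only the k best experts
        weights = F.softmax(weights, dim=-1)             # normalise their gate weights
        out = torch.zeros_like(x)
        # Only the selected experts run, so most parameters stay idle per token.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: a batch of 4 sequences of 16 tokens flows through the layer.
moe = ToySparseMoE()
y = moe(torch.randn(4, 16, 64))
print(y.shape)  # torch.Size([4, 16, 64])
```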


