April 18, 2024, 2:39 p.m. | Ryan Daws

AI News www.artificialintelligence-news.com

Mistral AI has released Mixtral 8x22B, which sets a new benchmark for open-source models in performance and efficiency. The model boasts robust multilingual capabilities and strong mathematical and coding prowess. Mixtral 8x22B operates as a Sparse Mixture-of-Experts (SMoE) model, activating only 39 billion of its 141 billion parameters for any given input. Beyond its efficiency, the…
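The "sparse Mixture-of-Experts" design mentioned above means a router selects a small subset of expert sub-networks per token, so only a fraction of the total parameters is used in each forward pass. The sketch below is a minimal, illustrative toy of top-k expert routing, not Mistral's actual implementation; all names and sizes (SparseMoE, d_model, n_experts, top_k) are assumptions chosen for the example.

```python
# Toy sparse Mixture-of-Experts layer with top-2 routing over 8 experts.
# Illustrative only; not Mixtral's architecture or code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                         # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

moe = SparseMoE()
tokens = torch.randn(10, 64)
print(moe(tokens).shape)  # torch.Size([10, 64]); each token passed through only 2 of 8 experts
```

Because only the selected experts run per token, compute scales with the "active" parameter count rather than the full model size, which is the efficiency property the article attributes to Mixtral 8x22B.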


The post Mixtral 8x22B sets new benchmark for open models appeared first on AI News.

