Mixtral 8x22B sets new benchmark for open models
Mistral AI has released Mixtral 8x22B, which sets a new benchmark for open-source models in performance and efficiency. The model boasts strong multilingual capabilities along with notable mathematical and coding prowess. Mixtral 8x22B is a Sparse Mixture-of-Experts (SMoE) model, activating just 39 billion of its 141 billion parameters per token. Beyond its efficiency, the...
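To illustrate the sparse-activation idea behind SMoE architectures, here is a minimal sketch of top-k expert routing in PyTorch. The layer sizes, the top-2 routing choice, and all names here are illustrative assumptions for the general technique, not Mixtral's actual configuration or implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sparse Mixture-of-Experts layer: a router picks top_k of
    n_experts per token, so only a fraction of parameters run."""

    def __init__(self, d_model: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Experts are plain feed-forward blocks; only top_k run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        logits = self.router(x)                            # (tokens, n_experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)               # normalise routing weights
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            # Gather only the tokens routed to expert i.
            token_idx, slot = (chosen == i).nonzero(as_tuple=True)
            if token_idx.numel():
                out[token_idx] += (
                    weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
                )
        return out

layer = SparseMoELayer(d_model=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Because each token passes through only top_k of the experts, the compute cost tracks the active parameter count rather than the total, which is the same principle behind Mixtral 8x22B's 39-billion-active / 141-billion-total split.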