Mistral's Mixtral 8x22B sets new records for open source LLMs
April 17, 2024, 6:46 p.m. | Matthias Bastian
THE DECODER the-decoder.com
French AI startup Mistral AI has unveiled Mixtral 8x22B, a new open-source language model that the company claims delivers the best performance and efficiency of any open-source LLM to date.