Mixtral 8x22B: AI startup Mistral releases new open language model
April 10, 2024, 3:34 p.m. | Maximilian Schreiner
THE DECODER the-decoder.com
Paris-based AI startup Mistral has released Mixtral-8x22B, a new open mixture-of-experts (MoE) language model, via a torrent link.
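Since the model's defining trait is its mixture-of-experts design, a toy sketch may help readers unfamiliar with the idea: a gating network scores all experts per token, only the top-scoring few are evaluated, and their outputs are combined. The function name, shapes, and the choice of top-2 routing below are illustrative assumptions, not Mistral's actual implementation.

```python
import numpy as np

def top2_moe(x, expert_weights, gate_weights):
    """Route a token vector to the top-2 of N experts (sparse MoE).

    Toy illustration only: a real MoE layer sits inside a transformer
    block with learned gating; names and shapes here are invented.
    """
    logits = gate_weights @ x                      # one score per expert
    top2 = np.argsort(logits)[-2:]                 # indices of the best two
    probs = np.exp(logits[top2] - logits[top2].max())
    probs /= probs.sum()                           # softmax over the chosen two
    # Each "expert" is just a linear map in this sketch; only the two
    # selected experts are actually evaluated (the sparsity win).
    return sum(p * (expert_weights[i] @ x) for p, i in zip(probs, top2))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
x = rng.standard_normal(d)
experts = rng.standard_normal((n_experts, d, d))
gate = rng.standard_normal((n_experts, d))
y = top2_moe(x, experts, gate)
print(y.shape)  # → (4,)
```

The payoff is that compute per token scales with the two active experts, not all eight, while total parameter count (and thus capacity) scales with all of them.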