April 10, 2024, 2:31 a.m.

Simon Willison's Weblog simonwillison.net

Mistral tweet a magnet link for mixtral-8x22b


Another open model release from Mistral using their now standard operating procedure of tweeting out a raw torrent link.


This one is an 8x22B Mixture of Experts model. Their previous most powerful openly licensed release was Mixtral 8x7B, so this one is a whole lot bigger (a 281GB download) - and apparently it has a 65,536 token context length, at least according to initial rumors on Twitter.
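The "8x22B" naming refers to a sparse Mixture of Experts architecture: eight expert feed-forward blocks per layer, with a small router network activating only a subset of them for each token (Mixtral 8x7B uses top-2 routing). A minimal NumPy sketch of that routing idea follows; all sizes and names are illustrative toys, not Mistral's actual implementation:

```python
# Illustrative top-2 Mixture-of-Experts routing (toy sizes, not Mistral's code).
import numpy as np

rng = np.random.default_rng(0)

n_experts = 8   # the "8x" in 8x22B: eight expert blocks per layer
d_model = 16    # toy hidden size; real models use thousands
top_k = 2       # Mixtral-style routing activates 2 experts per token

# Each expert is reduced to a single weight matrix for the sketch.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating network

def moe_layer(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # indices of the best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, top[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                   # softmax over chosen experts only
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_layer(tokens)
print(y.shape)  # (4, 16)
```

This sparsity is why an 8x22B model is cheaper to run than its total parameter count suggests: every token only passes through two of the eight experts.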

