April 10, 2024, 2:31 a.m.

Simon Willison's Weblog simonwillison.net

Mistral tweet a magnet link for mixtral-8x22b


Another open model release from Mistral using their now standard operating procedure of tweeting out a raw torrent link.


This one is an 8x22B Mixture of Experts model. Their previous most powerful openly licensed release was Mixtral 8x7B, so this one is a whole lot bigger (a 281GB download) - and it apparently has a 65,536 token context length, at least according to initial rumors on Twitter.
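One appeal of a Mixture of Experts design is that the parameters you download are not the parameters you pay for at inference time: a router activates only a couple of experts per token. Here's a rough back-of-envelope sketch of that, assuming top-2 routing as in Mixtral 8x7B; the figures are illustrative naive estimates (they ignore shared attention layers), not confirmed specs for mixtral-8x22b.

```python
def moe_param_estimate(n_experts: int, expert_params_b: float,
                       active_experts: int) -> tuple[float, float]:
    """Naive MoE parameter estimate, in billions.

    Ignores shared (non-expert) parameters, so these are
    rough upper/lower bounds, not real model sizes.
    """
    total = n_experts * expert_params_b          # all experts on disk
    active = active_experts * expert_params_b    # experts used per token
    return total, active


# Hypothetical 8x22B with top-2 routing (assumption, not a confirmed spec):
total, active = moe_param_estimate(8, 22, 2)
print(f"naive total: {total}B, active per token: {active}B")
```

The gap between those two numbers is why an 8x22B model can be a 281GB download while still running far cheaper per token than a dense model of the same total size.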
