Mistral tweet a magnet link for mixtral-8x22b
Simon Willison's Weblog (simonwillison.net)
Another open model release from Mistral using their now standard operating procedure of tweeting out a raw torrent link.
This one is an 8x22B Mixture of Experts model. Their previous most powerful openly licensed release was Mixtral 8x7B, so this one is a whole lot bigger (a 281GB download) - and apparently has a 65,536 token context length, at least according to initial rumors on Twitter.
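A quick back-of-envelope check on that 281GB figure (a sketch only - it assumes the torrent ships bfloat16 weights, which the post doesn't confirm):

```python
# Rough parameter count implied by the 281GB download, assuming
# bfloat16 weights (2 bytes per parameter) -- an assumption, since
# the torrent's actual dtype isn't stated.
download_bytes = 281e9
bytes_per_param = 2  # bfloat16

total_params_billions = download_bytes / bytes_per_param / 1e9
print(total_params_billions)  # 140.5

# A naive reading of "8x22B" would suggest 8 * 22 = 176B parameters.
# The gap is consistent with how Mixtral 8x7B works: the experts only
# replace the feed-forward layers, and the attention layers are shared
# across all eight experts.
naive_params_billions = 8 * 22
print(naive_params_billions)  # 176
```

The same arithmetic on Mixtral 8x7B (46.7B total parameters rather than a naive 56B) shows the same shared-layer discount.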
Tags: ai, generative-ai, homebrew-llms, llms, mistral, mixtral