Jan. 31, 2024, 9:25 p.m. | 1littlecoder (www.youtube.com)

"An over-enthusiastic employee of one of our early access customers leaked a quantised (and watermarked) version of an old model we trained and distributed quite openly.

To quickly start working with a few selected customers, we retrained this model from Llama 2 the minute we got access to our entire cluster — the pretraining finished on the day of Mistral 7B release." - Mistral CEO Arthur Mensch, confirming the model leak
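For context on what "quantised" means here: quantisation compresses a model by mapping its floating-point weights to low-bit integers plus a scale factor, which is why leaked quantised checkpoints are much smaller than the original weights. The sketch below is a generic symmetric int8 scheme for illustration only; it is not Mistral's or llama.cpp's actual quantisation pipeline.

```python
def quantize_int8(weights):
    """Symmetric per-tensor int8 quantisation.

    Maps floats into the integer range [-127, 127] using a single
    scale factor derived from the largest-magnitude weight.
    """
    max_abs = max(abs(w) for w in weights)
    scale = (max_abs / 127) or 1.0  # avoid divide-by-zero for all-zero tensors
    q = [round(w / scale) for w in weights]
    return q, scale


def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [v * scale for v in q]


# Example: a tiny "weight tensor" round-tripped through quantisation.
weights = [0.25, -1.0, 0.5, 0.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

The round trip is lossy: each weight is recovered only to within half a quantisation step, which is the accuracy/size trade-off quantised releases accept.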

🔗 Links 🔗

https://twitter.com/arthurmensch/status/1752737462663684344


