Dec. 12, 2023, 9:45 a.m. | code_your_own_AI

code_your_own_AI www.youtube.com

Mistral AI's new Sparse Mixture-of-Experts (SMoE) model is now available: Mixtral 8x7B. There is also a DPO instruction-tuned version, Mixtral 8x7B Instruct, which we test on real-world causal reasoning in a live recording.

Plus Python code to run inference on Mixtral 8x7B in fp32, in fp16, and in a 4-bit quantized version, and also with Flash Attention 2.
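
For orientation, here is a minimal sketch of what such inference code can look like with the Hugging Face transformers library. The model id, prompt, and generation settings are illustrative assumptions, not the exact code shown in the video.

```python
# Sketch: loading and running Mixtral 8x7B with Hugging Face transformers.
# Assumes transformers, torch, bitsandbytes (for 4-bit), and flash-attn (for Flash Attention 2) are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# fp16 inference (fp32 would use torch_dtype=torch.float32, at roughly twice the memory)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    # Optional: requires the flash-attn package and a compatible GPU
    attn_implementation="flash_attention_2",
)

# Alternative: 4-bit quantized inference via bitsandbytes
# bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
# model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map="auto")

# Illustrative causal-reasoning prompt
prompt = "If the street is wet but it has not rained, what could explain it?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```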

Plus costs for the inference API (cost per token) and the embedding API.

00:00 Live test of Mixtral 8x7B …

