Jan. 27, 2024, 2:33 p.m. | 1littlecoder (www.youtube.com)

With today’s desktop browser update (v1.62), we are excited to announce that we have integrated Mixtral 8x7B as the default large language model (LLM) in Leo, our recently released, privacy-preserving AI browser assistant. Mixtral 8x7B is an open-source LLM released by Mistral AI this past December, and it has already seen broad adoption thanks to its speed and performance. In addition, we’ve made several improvements to the Leo user experience, focusing on clearer onboarding, context controls, input and response formatting, …

