Introducing Mixtral 8x7B with Databricks Model Serving
Dec. 21, 2023, 5 p.m. | Databricks (www.databricks.com)
Today, Databricks is excited to announce support for Mixtral 8x7B in Model Serving. Mixtral 8x7B is a sparse Mixture of Experts (MoE) open...
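The "sparse Mixture of Experts" design mentioned above means each token is routed to only a small subset of expert sub-networks (in Mixtral's case, 2 of 8 experts per token, per the Mixtral paper), so only a fraction of the model's parameters are active per forward pass. A toy sketch of that top-k routing idea, with made-up scalar "experts" and router scores (none of this is Databricks or Mixtral code):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_layer(token, gate_scores, experts, top_k=2):
    """Route a token to the top_k highest-scoring experts and mix their
    outputs by renormalized gate weights; the remaining experts are
    never evaluated, which is what makes the layer 'sparse'."""
    ranked = sorted(range(len(experts)), key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:top_k]
    weights = softmax([gate_scores[i] for i in chosen])
    return sum(w * experts[i](token) for w, i in zip(weights, chosen))

# Four toy "experts": each just scales its input by a different factor.
experts = [lambda x, k=k: (k + 1) * x for k in range(4)]
gate_scores = [0.1, 2.0, 0.3, 1.5]  # hypothetical router logits for one token
out = moe_layer(1.0, gate_scores, experts)
```

Here the router picks experts 1 and 3 (scores 2.0 and 1.5), so the output is a weighted blend of their results while experts 0 and 2 are skipped entirely; in a real MoE transformer the experts are feed-forward blocks and the gate is a learned linear layer.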
Tags: databricks, engineering blog, experts, mixtral, mixtral 8x7b, mixture of experts, moe, platform blog, support
Jobs in AI, ML, Big Data
AI Research Scientist
@ Vara | Berlin, Germany and Remote
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Business Data Analyst
@ Alstom | Johannesburg, GT, ZA