March 27, 2024, 2 p.m. | code_your_own_AI

code_your_own_AI | www.youtube.com

DBRX: NEW MOST POWERFUL Open Source LLM: MoE 132B 16E 32K 12T

Databricks reveals DBRX: after two months of cloud compute on 3072 NVIDIA H100 GPUs, DBRX sets a new state of the art among established open LLMs. Moreover, it gives the open community, and enterprises building their own LLMs, capabilities that were previously limited to closed-model APIs: according to first measurements, it surpasses GPT-3.5 and is competitive with Gemini 1.0 Pro.
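The suffix in the title decodes the architecture: a fine-grained mixture-of-experts (MoE) transformer with 132B total parameters, 16 experts per MoE layer of which 4 are active per token (roughly 36B active parameters), a 32K-token context window, and 12T tokens of training data. As a rough illustration of that top-4-of-16 routing pattern, here is a toy sketch in PyTorch; the layer sizes are made up and this is not Databricks' actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Toy top-k mixture-of-experts layer (DBRX-style routing: 4 of 16 experts).

    Dimensions are illustrative, not DBRX's real sizes."""

    def __init__(self, d_model=64, d_ff=256, n_experts=16, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # pick 4 of 16 experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e                # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out

x = torch.randn(8, 64)         # 8 tokens
print(ToyMoELayer()(x).shape)  # torch.Size([8, 64])
```

Only the 4 selected experts run for each token, which is why a 132B-parameter model can serve requests with the compute profile of a much smaller dense model.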

Great job @Databricks

All rights remain with the authors:
https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm
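The weights themselves were published on Hugging Face alongside the blog post. A minimal usage sketch with the transformers library, assuming the repo id databricks/dbrx-instruct (the checkpoint is gated, so accept the license first; the full 132B weights need on the order of 264 GB of GPU memory in 16-bit):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as published by Databricks on Hugging Face (gated; accept the license first).
model_id = "databricks/dbrx-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~264 GB of weights; multi-GPU is needed in practice
    device_map="auto",
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "What is a mixture-of-experts model?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```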

#databricks …

