March 24, 2024, 1 p.m. | Matthew S. Smith

IEEE Spectrum spectrum.ieee.org



Last weekend, X.ai released Grok-1, the world’s largest “open-source” large language model (LLM). At 314 billion parameters, it far exceeds predecessors like Falcon 180B, which has 180 billion.

That’s impressive, but the timing of X.ai’s release—just a few weeks after Elon Musk, founder of X.ai, filed suit against OpenAI for an alleged lack of openness—raised eyebrows. Is Grok-1 a useful effort to move open-source AI forward, or more of a ploy to prove Musk’s company …

