March 24, 2024, 1 p.m. | Matthew S. Smith

IEEE Spectrum | spectrum.ieee.org



Last weekend, xAI released the world’s largest “open-source” large language model (LLM), Grok-1. At 314 billion parameters, it far exceeds predecessors like Falcon 180B, which weighs in at 180 billion.

That’s impressive, but the timing of xAI’s release—just a few weeks after Elon Musk, founder of xAI, filed suit against OpenAI for an alleged lack of openness—raised eyebrows. Is Grok-1 a useful effort to move open-source AI forward, or more of a ploy to prove Musk’s company …

