March 18, 2024, 4:36 a.m. | /u/Excellent-Target-847

Artificial Intelligence www.reddit.com

1. **Elon Musk**’s **xAI** has open-sourced the base code of its **Grok** AI model, though without the training code. The company describes it on GitHub as a “314 billion parameter Mixture-of-Experts model” (see the sketch after this list).\[1\]
2. **Apple** announces **MM1**, a family of multimodal LLMs of up to 30B parameters that are SoTA on pre-training metrics and perform competitively after fine-tuning.\[2\]
3. **Microsoft** tells European regulators **Google** has an edge in generative AI.\[3\]
4. **Nvidia**’s Jensen Huang and the Fed’s Jerome Powell may rock markets this week.\[4\]
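
The “Mixture-of-Experts” in item 1 refers to an architecture where a learned router activates only a small subset of expert sub-networks per token, which is how a model like Grok can hold 314B total parameters while running only a fraction of them on each forward pass. Below is a minimal, illustrative PyTorch sketch of a top-k MoE layer; it is not xAI’s implementation, and every name and size in it (`MoELayer`, `d_model`, `n_experts`, `top_k`) is an arbitrary assumption for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy mixture-of-experts layer (names and sizes are illustrative only)."""

    def __init__(self, d_model=64, d_hidden=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        scores = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen k
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k : k + 1] * expert(x[mask])
        return out

# Only top_k of n_experts run per token, which is how MoE models
# keep per-token compute far below their total parameter count.
layer = MoELayer()
print(layer(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

The per-expert loop is written for readability; real implementations instead gather the tokens routed to each expert into batches and typically add a load-balancing loss so the router spreads tokens across experts.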

Sources:

\[1\] …

