March 17, 2024, 7:30 p.m. |

Techmeme www.techmeme.com


x.ai:

xAI open sources the base model weights and network architecture of Grok-1, a 314B parameter Mixture-of-Experts model, under the Apache 2.0 license  —  We are releasing the base model weights and network architecture of Grok-1, our large language model.  Grok-1 is a 314 billion parameter Mixture …
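The item notes Grok-1 is a Mixture-of-Experts model: instead of running the full 314B parameters for every token, a learned router selects a small subset of expert subnetworks and combines their outputs. A minimal sketch of that routing idea, using toy dimensions and random weights (this is an illustration of MoE routing in general, not Grok-1's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts = 8, 4                               # toy sizes, not Grok-1's

gate_w = rng.normal(size=(d, n_experts))          # router (gating) weights
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]

def moe_forward(x, top_k=2):
    """Route x to the top_k highest-scoring experts and
    return the gate-weighted sum of their outputs."""
    scores = x @ gate_w                           # one router score per expert
    top = np.argsort(scores)[-top_k:]             # indices of the top_k experts
    w = np.exp(scores[top] - scores[top].max())
    w /= w.sum()                                  # softmax over chosen experts only
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

y = moe_forward(rng.normal(size=d))               # only 2 of 4 experts ran
```

The payoff is that per-token compute scales with `top_k` experts rather than all of them, which is how MoE models hold large total parameter counts at a fraction of the inference cost.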

