March 4, 2024, 8:10 p.m. | Sam Charrington

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence) twimlai.com

Today we’re joined by Akshita Bhagia, a senior research engineer at the Allen Institute for AI. Akshita joins us to discuss OLMo, a new open source language model available in 7-billion- and 1-billion-parameter variants. Unlike similar models offered by Meta, Mistral, and others, OLMo comes with a key difference: AI2 has also published the dataset and key tools used to train the model. In our chat with Akshita, we dig into the OLMo models and …

