March 4, 2024, 8:10 p.m. | Sam Charrington

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence) twimlai.com

Today we’re joined by Akshita Bhagia, a senior research engineer at the Allen Institute for AI. Akshita joins us to discuss OLMo, a new open source language model available in 7 billion and 1 billion parameter variants, with a key difference from similar models offered by Meta, Mistral, and others: AI2 has also published the dataset and key tools used to train the model. In our chat with Akshita, we dig into the OLMo models and …

