Feb. 16, 2024, noon | Super Data Science: ML & AI Podcast with Jon Krohn

Super Data Science Podcast with Jon Krohn www.youtube.com

Jon Krohn explores the groundbreaking Mamba architecture, the latest challenger to the Transformer model's dominance in the AI landscape, walking through the key insights from the paper "Mamba: Linear-Time Sequence Modeling with Selective State Spaces," authored by leading researchers at Carnegie Mellon and Princeton. Learn why the tech community is buzzing about Mamba's ability to handle long input sequences efficiently, a task that has traditionally bogged down Transformer models, whose self-attention cost grows quadratically with sequence length. Jon also explores how Mamba's selective information …
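The scaling contrast above can be illustrated with a toy sketch. This is not Mamba's actual selective-scan implementation, just an assumed scalar state-space recurrence showing why a recurrent state update costs O(n) in sequence length, while full self-attention compares every pair of tokens and so costs O(n²):

```python
# Illustrative sketch only, NOT Mamba's real implementation:
# a scalar state-space model processed as a single left-to-right scan.

def ssm_scan(x, a=0.9, b=0.5, c=1.0):
    """Run a toy state-space recurrence over sequence x in one O(n) pass."""
    h = 0.0          # hidden state carried across time steps
    ys = []
    for x_t in x:    # one constant-time update per token
        h = a * h + b * x_t
        ys.append(c * h)
    return ys

def attention_pair_count(n):
    """Token-pair interactions full self-attention computes: n * n."""
    return n * n     # this quadratic growth is what slows Transformers

seq = [1.0, 2.0, 3.0, 4.0]
out = ssm_scan(seq)
print(len(out))                    # one output per input token
print(attention_pair_count(1024))  # 1048576 pairs for 1024 tokens
```

Mamba's actual contribution, as the paper's title suggests, is making the recurrence parameters input-dependent ("selective") while keeping this linear-time scan structure.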


Software Engineer for AI Training Data (School Specific)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Python)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Tier 2)

@ G2i Inc | Remote

Data Engineer

@ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert

@ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI)

@ Cere Network | San Francisco, US