Feb. 16, 2024, noon | Super Data Science: ML & AI Podcast with Jon Krohn

Watch on YouTube: www.youtube.com

Jon Krohn explores the groundbreaking Mamba architecture, the latest challenger to the Transformer's dominance in the AI landscape, walking through the key insights from the paper "Mamba: Linear-Time Sequence Modeling with Selective State Spaces," authored by leading researchers at Carnegie Mellon and Princeton. Learn why the tech community is buzzing about Mamba's ability to handle long input sequences efficiently, a task that has traditionally bogged down Transformer models with computational inefficiencies. Jon also explores how Mamba's selective information …
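The linear-time claim discussed in the episode can be sketched in a few lines: a state-space model updates a hidden state once per timestep with input-dependent ("selective") gates, so the cost grows linearly with sequence length, whereas self-attention compares every pair of tokens. The toy scan below is an illustrative sketch with made-up weights, not the paper's hardware-aware implementation.

```python
import math
import random

def selective_ssm_scan(x, d_state=4, seed=0):
    """Toy selective state-space scan: O(L) in sequence length L.

    Hypothetical weights and shapes for illustration only; the real
    Mamba model learns these and fuses the scan on GPU.
    """
    rng = random.Random(seed)
    w_a = [rng.gauss(0, 1) for _ in range(d_state)]
    w_b = [rng.gauss(0, 1) for _ in range(d_state)]
    w_c = [rng.gauss(0, 1) for _ in range(d_state)]
    h = [0.0] * d_state  # hidden state carried across timesteps
    y = []
    for x_t in x:
        # Input-dependent ("selective") gates: each token decides how
        # much past state to keep (a) and how much input to write (b).
        a = [1.0 / (1.0 + math.exp(-w * x_t)) for w in w_a]
        b = [w * x_t for w in w_b]
        h = [a_i * h_i + b_i for a_i, h_i, b_i in zip(a, h, b)]
        y.append(sum(w * h_i for w, h_i in zip(w_c, h)))  # readout
    return y

outputs = selective_ssm_scan([0.1 * i for i in range(8)])
```

Each timestep touches the state exactly once, so doubling the sequence length doubles the work; a Transformer's attention would quadruple it.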


Data Architect

@ University of Texas at Austin | Austin, TX

Data ETL Engineer

@ University of Texas at Austin | Austin, TX

Lead GNSS Data Scientist

@ Lurra Systems | Melbourne

Senior Machine Learning Engineer (MLOps)

@ Promaton | Remote, Europe

Senior Data Science Analyst- ML/DL/LLM

@ Mayo Clinic | Jacksonville, FL, United States

Machine Learning Research Scientist, Robustness and Uncertainty

@ Nuro, Inc. | Mountain View, California (HQ)