Feb. 24, 2024, 7:11 a.m. | /u/Alarmed-Profile5736

Machine Learning www.reddit.com

Hello!

I've been pondering this question for some time. To clarify, I'm not referring to practical concerns like "it hasn't been tested extensively," "its scalability is uncertain," or "there's a lack of industry infrastructure." Instead, I'm interested in the core architectural differences between Transformers and Mamba, and specifically whether any of those differences place Mamba at a fundamental disadvantage.
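To make the question concrete, here is a toy sketch of the structural difference I have in mind. It contrasts causal attention, which re-reads a growing cache of all past keys/values at every step, with a simplified linear state-space recurrence, which compresses the entire history into a fixed-size state. Note the SSM here is a non-selective toy (constant A, B, C matrices); actual Mamba makes these input-dependent ("selective"), so this is only an illustration of the memory/compute shape, not of Mamba itself.

```python
import numpy as np

np.random.seed(0)
d = 4  # toy model/state dimension

def attention_step(q, K, V):
    """One causal-attention query: attends over ALL past keys/values.
    The KV cache grows linearly with sequence length, and each step
    costs O(t * d) at position t."""
    scores = K @ q / np.sqrt(d)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ V

def ssm_step(h, x, A, B, C):
    """One linear state-space step: the whole history is compressed
    into a FIXED-size state h, so each step costs O(d^2) regardless
    of how long the sequence is."""
    h = A @ h + B @ x
    y = C @ h
    return h, y

T = 8
xs = np.random.randn(T, d)

# Attention: the cache of keys/values grows with every token.
K_cache, V_cache = [], []
for x in xs:
    K_cache.append(x)
    V_cache.append(x)
    y_attn = attention_step(x, np.stack(K_cache), np.stack(V_cache))

# Toy SSM: constant-size state, no cache at all.
A, B, C = 0.9 * np.eye(d), np.eye(d), np.eye(d)
h = np.zeros(d)
for x in xs:
    h, y_ssm = ssm_step(h, x, A, B, C)

print(len(K_cache))  # memory grew with sequence length
print(h.shape)       # memory stayed fixed at (d,)
```

What I'd like to understand is whether the fixed-size state on the SSM side necessarily loses information that attention's exact lookup over the full context retains, and whether that is the kind of disadvantage that can't be engineered away.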

Best regards!