Jan. 7, 2024, 10:30 p.m. | Venelin Valkov

Discover Mamba, a neural network architecture challenging the Transformer's dominance in AI. Mamba combines selective state spaces with linear-time sequence modeling, promising strong efficiency and the ability to handle very long sequences at linear cost in sequence length.

Paper: https://arxiv.org/abs/2312.00752
Mamba Chat: https://github.com/havenhq/mamba-chat
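For intuition, here is a minimal sketch of the selective state-space (S6) recurrence described in the paper linked above: Δ, B, and C are computed from the current input (the "selection" mechanism), the hidden state is updated once per token with constant work, so the full scan scales linearly with sequence length. The class name, projection names (dt_proj, B_proj, C_proj), and shapes below are illustrative assumptions, not the reference implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sketch of a selective SSM scan (assumed shapes, not the official Mamba code).
class SelectiveSSM(nn.Module):
    def __init__(self, d_model=16, d_state=4):
        super().__init__()
        # A is a learned state matrix shared across time steps (kept negative for stability).
        self.A_log = nn.Parameter(torch.zeros(d_model, d_state))
        # Input-dependent ("selective") parameters: Δ, B, C are functions of x_t.
        self.dt_proj = nn.Linear(d_model, d_model)
        self.B_proj = nn.Linear(d_model, d_state)
        self.C_proj = nn.Linear(d_model, d_state)

    def forward(self, x):                       # x: (batch, length, d_model)
        batch, length, d_model = x.shape
        A = -torch.exp(self.A_log)              # (d_model, d_state)
        h = x.new_zeros(batch, d_model, A.shape[1])
        ys = []
        for t in range(length):                 # the real kernel fuses this scan on GPU
            xt = x[:, t]
            delta = F.softplus(self.dt_proj(xt))          # (batch, d_model)
            B = self.B_proj(xt)                           # (batch, d_state)
            C = self.C_proj(xt)                           # (batch, d_state)
            # Zero-order-hold style discretization: Ā = exp(ΔA), B̄ ≈ ΔB
            A_bar = torch.exp(delta.unsqueeze(-1) * A)    # (batch, d_model, d_state)
            B_bar = delta.unsqueeze(-1) * B.unsqueeze(1)  # (batch, d_model, d_state)
            h = A_bar * h + B_bar * xt.unsqueeze(-1)      # constant work per token
            ys.append((h * C.unsqueeze(1)).sum(-1))       # y_t = C h_t
        return torch.stack(ys, dim=1)                     # (batch, length, d_model)

# Usage: one forward pass over a toy batch; cost grows linearly with the length dimension.
y = SelectiveSSM()(torch.randn(2, 32, 16))
print(y.shape)  # torch.Size([2, 32, 16])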

AI Bootcamp (in preview): https://www.mlexpert.io/membership
Discord: https://discord.gg/UaNPxVD6tv
Subscribe: http://bit.ly/venelin-subscribe
GitHub repository: https://github.com/curiousily/Get-Things-Done-with-Prompt-Engineering-and-LangChain

Join this channel to get access to perks and support my work:
https://www.youtube.com/channel/UCoW_WzQNJVAjxo4osNAxd_g/join

#artificialintelligence #chatgpt #gpt4 #python #chatbot #llama2 #llm
