March 28, 2024, 1:24 a.m. | Kola Ayonrinde

The Gradient

Is Attention all you need? Mamba, a novel AI model based on State Space Models (SSMs), emerges as a formidable alternative to the widely used Transformer models, addressing their inefficiency in processing long sequences.
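The teaser's core claim is that SSM-based models like Mamba scale linearly with sequence length, while self-attention compares every pair of tokens and scales quadratically. A minimal sketch of the idea, using a toy discretized linear SSM with made-up parameters (this is an illustration of the recurrence h_t = A h_{t-1} + B x_t, y_t = C h_t, not Mamba's actual implementation, which adds input-dependent parameters and a hardware-aware scan):

```python
# Hypothetical sketch (not Mamba's actual code): a discretized linear state
# space model. Each step costs a fixed amount of work, so a length-L
# sequence costs O(L), unlike self-attention's O(L^2) pairwise comparisons.

def ssm_scan(x, A, B, C):
    """Run a tiny linear SSM over a scalar input sequence x.

    A: state-transition matrix (list of rows); B, C: vectors of the same size.
    """
    n = len(B)
    h = [0.0] * n          # hidden state, carried across the sequence
    ys = []
    for x_t in x:
        # state update: h = A h + B x_t
        h = [sum(A[i][j] * h[j] for j in range(n)) + B[i] * x_t
             for i in range(n)]
        # readout: y = C . h
        ys.append(sum(C[i] * h[i] for i in range(n)))
    return ys

# Toy parameters, invented purely for illustration.
A = [[0.9, 0.0], [0.1, 0.8]]
B = [1.0, 0.5]
C = [0.2, 0.3]
x = [t / 15 for t in range(16)]   # a length-16 toy sequence
y = ssm_scan(x, A, B, C)
print(len(y))                     # one output per input token
```

Because the state h summarizes everything seen so far in a fixed-size vector, memory stays constant as the sequence grows, which is the source of the efficiency advantage the article highlights.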

