March 5, 2024, 2:42 p.m. | Ameen Ali, Itamar Zimerman, Lior Wolf

cs.LG updates on arXiv.org

arXiv:2403.01590v1 Announce Type: new
Abstract: The Mamba layer offers an efficient selective state space model (SSM) that is highly effective in modeling multiple domains, including NLP, long-range sequence processing, and computer vision. Selective SSMs are viewed as dual models, in which one trains in parallel on the entire sequence via an IO-aware parallel scan and deploys in an autoregressive manner. We add a third view and show that such models can be viewed as attention-driven models. This new perspective enables us …
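The third view follows from unrolling the selective SSM recurrence: each output becomes a causally masked, input-dependent weighted sum of past inputs, i.e. an implicit attention matrix. Below is a minimal sketch (not the authors' code) of that equivalence; the per-channel shapes and the parameter names A, B, C, N are illustrative assumptions.

```python
import numpy as np

def selective_ssm_recurrent(x, A, B, C):
    """Selective SSM recurrence: h_t = A_t * h_{t-1} + B_t * x_t, y_t = C_t . h_t.

    x: (L,) input sequence; A, B, C: (L, N) input-dependent ("selective")
    per-step parameters, with A applied elementwise to the hidden state of size N.
    """
    L, N = A.shape
    h = np.zeros(N)
    y = np.zeros(L)
    for t in range(L):
        h = A[t] * h + B[t] * x[t]   # input-dependent transition and input map
        y[t] = C[t] @ h              # input-dependent readout
    return y

def selective_ssm_as_attention(x, A, B, C):
    """Same computation via an explicit causal attention-like matrix:
    alpha[t, s] = C_t . ((A_{s+1} * ... * A_t) * B_s), so that y = alpha @ x."""
    L, N = A.shape
    alpha = np.zeros((L, L))
    for t in range(L):
        prod = np.ones(N)                     # running product A_t ... A_{s+1}
        for s in range(t, -1, -1):
            alpha[t, s] = C[t] @ (prod * B[s])
            prod = prod * A[s]                # extend the product one step back
    return alpha @ x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L, N = 6, 4
    x = rng.normal(size=L)
    A = rng.uniform(0.5, 1.0, size=(L, N))    # stable, input-dependent decays
    B = rng.normal(size=(L, N))
    C = rng.normal(size=(L, N))
    print(np.allclose(selective_ssm_recurrent(x, A, B, C),
                      selective_ssm_as_attention(x, A, B, C)))  # True
```

The lower-triangular matrix alpha plays the role of attention scores, which is what lets such models be inspected with attention-style tools.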

