April 26, 2024, 10 a.m. | Editorial Team

insideBIGDATA (insidebigdata.com)

AI21, a leader in AI systems for the enterprise, has unveiled Jamba, a production-grade Mamba-style model that integrates Mamba Structured State Space Model (SSM) technology with elements of the traditional Transformer architecture. Jamba marks a significant advance in large language model (LLM) development, offering strong efficiency, throughput, and performance.
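The hybrid pattern described above can be sketched in a toy form: cheap linear-time state-space blocks interleaved with a few quadratic-cost attention blocks. This is a minimal illustrative sketch, not AI21's actual Jamba implementation; the 1:1 interleaving ratio, the diagonal state matrix, and all layer names here are assumptions for demonstration.

```python
import numpy as np

def ssm_block(x, A, B, C):
    """Minimal linear state-space recurrence: h_t = A h_{t-1} + B x_t,
    y_t = C h_t, applied per time step (linear cost in sequence length)."""
    T, d = x.shape
    h = np.zeros(A.shape[0])
    out = np.zeros_like(x)
    for t in range(T):
        h = A @ h + B @ x[t]
        out[t] = C @ h
    return out

def attention_block(x):
    """Plain softmax self-attention (quadratic cost in sequence length)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

def hybrid_forward(x, n_layers=4, state_dim=8, seed=0):
    """Alternate SSM and attention layers with residual connections,
    as a hybrid design mixes recurrent blocks with attention blocks."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    for layer in range(n_layers):
        if layer % 2 == 0:
            A = 0.9 * np.eye(state_dim)                    # stable decay
            B = rng.standard_normal((state_dim, d)) * 0.1  # input projection
            C = rng.standard_normal((d, state_dim)) * 0.1  # output projection
            x = x + ssm_block(x, A, B, C)
        else:
            x = x + attention_block(x)
    return x

tokens = np.random.default_rng(1).standard_normal((16, 32))  # (seq_len, d_model)
out = hybrid_forward(tokens)
print(out.shape)  # (16, 32)
```

The appeal of such a hybrid is that most layers run in time linear in sequence length, while the occasional attention layer retains full pairwise token interaction.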

