March 29, 2024, 12:34 a.m. | Mike Wheatley

AI – SiliconANGLE (siliconangle.com)


Generative artificial intelligence startup AI21 Labs Ltd., a rival to OpenAI, has unveiled what it says is a groundbreaking new AI model called Jamba that goes beyond the traditional transformer-based architecture underpinning today's most powerful large language models. Announced today, Jamba combines the transformer architecture with Mamba, a more recent model based on Structured […]
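The excerpt stops short of implementation details, but the core idea of a hybrid design, interleaving a few quadratic-cost attention layers with many linear-cost state-space (Mamba-style) layers so that long contexts stay affordable, can be illustrated with a rough sketch. The code below is an assumption-laden toy in PyTorch: SimpleSSMBlock is a heavily simplified gated recurrence standing in for Mamba's selective scan, and the dimensions, layer ratio, and class names are illustrative choices, not AI21's published Jamba configuration.

```python
# Illustrative sketch (not AI21's implementation): interleave standard
# self-attention blocks with a simplified gated state-space block.
# All sizes and the mixing ratio are assumptions chosen for readability.
import torch
import torch.nn as nn


class SimpleSSMBlock(nn.Module):
    """Very reduced stand-in for a Mamba-style layer: a gated linear
    recurrence scanned over the sequence, giving O(n) time and a small
    fixed-size state instead of attention's O(n^2) pairwise scores."""

    def __init__(self, dim: int):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        self.gate_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)
        # Per-channel decay controls how quickly old context fades.
        self.decay = nn.Parameter(torch.full((dim,), 0.9))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        u = self.in_proj(x)
        gate = torch.sigmoid(self.gate_proj(x))
        state = torch.zeros(x.size(0), x.size(2), device=x.device)
        outputs = []
        for t in range(x.size(1)):
            # Recurrent update: carry a compressed summary of the past.
            state = self.decay * state + u[:, t]
            outputs.append(state)
        h = torch.stack(outputs, dim=1) * gate
        return x + self.out_proj(h)  # residual connection


class AttentionBlock(nn.Module):
    """Standard transformer-style self-attention block."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        out, _ = self.attn(h, h, h, need_weights=False)
        return x + out  # residual connection


class HybridModel(nn.Module):
    """Interleaves one attention block with several SSM-style blocks,
    the general idea behind hybrid transformer/Mamba designs."""

    def __init__(self, dim: int = 64, groups: int = 2, ssm_per_group: int = 3):
        super().__init__()
        layers = []
        for _ in range(groups):
            layers.append(AttentionBlock(dim))
            layers.extend(SimpleSSMBlock(dim) for _ in range(ssm_per_group))
        self.layers = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)


if __name__ == "__main__":
    model = HybridModel()
    tokens = torch.randn(2, 16, 64)  # (batch, seq_len, dim)
    print(model(tokens).shape)       # torch.Size([2, 16, 64])
```

The appeal of the state-space layers in such hybrids is that the recurrent state has a fixed size regardless of sequence length, which is what makes much longer context windows practical than a pure attention stack at comparable cost.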

The post AI21 Labs’ Jamba infuses Mamba to bring more context to transformer-based LLMs appeared first on SiliconANGLE.

Tags: ai, ai21, ai21 labs, ai model, architecture, artificial, artificial intelligence, beyond, context, context window, generative, generative-ai, generative artificial intelligence, groundbreaking, intelligence, jamba, labs, language, language models, large language, large language models, llms, mamba, openai, startup, structured state space, the-latest, transformer, transformers

