April 2, 2024, 1:36 p.m.

Latest stories for ZDNET in Artificial Intelligence (www.zdnet.com)

Two new open large language models, AI21's Jamba and Databricks' DBRX, dramatically reduce the compute and memory needed for inference while matching or beating the performance of top models such as GPT-3.5 and Llama 2.
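
Both models ship with openly downloadable weights, so they can be tried directly. Below is a minimal sketch of loading one of them with the Hugging Face transformers library; the checkpoint names, prompt, and generation settings are assumptions for illustration, not details from the story:

```python
# Minimal sketch: loading an open LLM checkpoint with Hugging Face transformers.
# Assumptions (not from the story): the "ai21labs/Jamba-v0.1" checkpoint name,
# a recent transformers release that supports it, and enough GPU memory to host it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"  # assumed; swap in "databricks/dbrx-instruct" for DBRX

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Large language models reduce inference cost by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short continuation of the prompt.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```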

Data Engineer @ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert @ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI) @ Cere Network | San Francisco, US

Research Engineer @ Allora Labs | Remote

Ecosystem Manager @ Allora Labs | Remote

Founding AI Engineer, Agents @ Occam AI | New York