April 21, 2022, 5:05 p.m. | Synced

Source: syncedreview.com

In the new paper GPT-NeoX-20B: An Open-Source Autoregressive Language Model, a research team from EleutherAI introduces GPT-NeoX-20B, an open-source 20-billion-parameter dense autoregressive language model with strong few-shot learning capabilities that significantly outperforms similarly sized GPT-3 and FairSeq models.


The post EleutherAI Introduces GPT-NeoX-20B: The World’s Largest and Most Performant Publicly Available Dense Language Model first appeared on Synced.

Tags: AI, artificial intelligence, autoregressive models, GPT, language, language model, machine learning, machine learning & data science, ML, research, technology
