April 5, 2022, 1 p.m. | Anthony Alford


Researchers from EleutherAI have open-sourced GPT-NeoX-20B, a 20-billion-parameter natural language processing (NLP) AI model similar to GPT-3. The model was trained on 825 GB of publicly available text data and has performance comparable to similarly sized GPT-3 models.
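
As a rough illustration of what the open-source release means in practice, the sketch below loads the checkpoint with the Hugging Face transformers library and samples a short completion. It assumes the weights are published on the Hugging Face Hub under the model ID "EleutherAI/gpt-neox-20b" and that the installed transformers version includes GPT-NeoX support; the 20B-parameter weights are tens of gigabytes, so substantial GPU memory (or CPU offloading) is needed to actually run it.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the released checkpoint is mirrored on the Hugging Face Hub
# under this model ID.
model_id = "EleutherAI/gpt-neox-20b"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Half precision plus automatic device placement (requires the accelerate
# package) keeps the memory footprint of the 20B weights as small as possible.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "GPT-NeoX-20B is a 20-billion-parameter language model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a short continuation from the model.
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))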
