Google Trains 280 Billion Parameter AI Language Model Gopher
Jan. 4, 2022, 2 p.m. | Anthony Alford
InfoQ - AI, ML & Data Engineering (www.infoq.com)
Google subsidiary DeepMind announced Gopher, a 280-billion-parameter natural language processing (NLP) model. Based on the Transformer architecture and trained on a 10.5TB corpus called MassiveText, Gopher outperformed the previous state-of-the-art on 100 of 124 evaluation tasks.
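To give a sense of where a parameter count in the hundreds of billions comes from, the sketch below applies the standard rough estimate of ~12·L·d² weights per decoder-only Transformer (attention projections plus a 4×-wide feed-forward), with an embedding table on top. The layer count, model width, and vocabulary size are illustrative round numbers chosen for this sketch, not Gopher's published configuration.

```python
# Rough parameter-count estimate for a decoder-only Transformer,
# the architecture family Gopher belongs to.
# NOTE: num_layers, d_model, and vocab_size below are illustrative
# assumptions, not DeepMind's published Gopher hyperparameters.

def transformer_params(num_layers: int, d_model: int, vocab_size: int) -> int:
    # Per block: ~4*d^2 for attention (Q, K, V, and output projections)
    # plus ~8*d^2 for a feed-forward layer with inner width 4*d.
    per_layer = 12 * d_model ** 2
    # Token-embedding table (often shared with the output softmax).
    embeddings = vocab_size * d_model
    return num_layers * per_layer + embeddings

total = transformer_params(num_layers=80, d_model=16384, vocab_size=32000)
print(f"~{total / 1e9:.0f}B parameters")
```

With widths in this range the estimate lands in the hundreds of billions of parameters, the same order of magnitude as Gopher; exact totals depend on details such as feed-forward width, biases, and whether embeddings are tied.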