Large Language Models, GPT-2 — Language Models are Unsupervised Multitask Learners
Feb. 10, 2024, 4:42 p.m. | Vyacheslav Efimov
Towards Data Science - Medium towardsdatascience.com
Acing GPT capabilities by turning it into a powerful multitask zero-shot model
Introduction
GPT is a well-known series of models whose latest versions currently dominate various NLP benchmarks. The first GPT version was a significant milestone: trained with roughly 117M parameters, the model demonstrated state-of-the-art performance on top benchmarks. From that point, researchers sought to improve the base version.
In 2019, researchers from OpenAI officially …
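The core idea behind GPT-2's multitask zero-shot behaviour is that any NLP task can be cast as plain-text conditioning: the model estimates p(output | input, task) with the task itself expressed in natural language, so no fine-tuning or task-specific head is needed. A minimal sketch of this prompt framing (the "TL;DR:" summarization cue appears in the GPT-2 paper; the other templates here are illustrative assumptions, not the paper's exact wording):

```python
def zero_shot_prompt(task: str, text: str) -> str:
    """Frame an NLP task as a plain-text prompt for a language model.

    GPT-2 performs tasks zero-shot by conditioning generation on a
    textual description of the task. The templates below are
    illustrative examples of such framing, not an official API.
    """
    templates = {
        # "TL;DR:" is the summarization cue used in the GPT-2 paper.
        "summarize": f"{text}\nTL;DR:",
        # Translation framed as a natural-language instruction.
        "translate_en_fr": f"Translate English to French:\n{text} =>",
        # Question answering framed as a reading-comprehension continuation.
        "qa": f"{text}\nQ: What is the main topic?\nA:",
    }
    return templates[task]


prompt = zero_shot_prompt("summarize", "GPT-2 is a transformer language model ...")
# A language model would then complete this prompt; the tokens generated
# after "TL;DR:" are taken as the summary.
```

In practice, such a prompt could be passed to any autoregressive text-generation model (for example, a `gpt2` checkpoint loaded through the Hugging Face `transformers` library) and the continuation read off as the task output.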