Feb. 16, 2024, 3:07 p.m. | Vyacheslav Efimov

Towards Data Science - Medium | towardsdatascience.com

Efficiently scaling GPT from large to titanic magnitudes within the meta-learning framework

Introduction

GPT is a family of language models that has recently gained a lot of popularity. The attention of the Data Science community was rapidly captured by the release of GPT-3 in 2020. After the appearance of GPT-2, few could have imagined that, only about a year later, a titanic version of GPT containing 175B parameters would appear! This is by two orders of …
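To make the scale concrete, here is a minimal back-of-the-envelope sketch, assuming the publicly reported parameter counts of roughly 1.5B for the largest GPT-2 model and 175B for GPT-3:

```python
import math

# Back-of-the-envelope scale comparison, assuming the publicly
# reported parameter counts: ~1.5B for GPT-2 and 175B for GPT-3.
gpt2_params = 1.5e9
gpt3_params = 175e9

ratio = gpt3_params / gpt2_params   # ~116.7x more parameters
orders = math.log10(ratio)          # ~2.07, i.e. roughly two orders of magnitude

print(f"GPT-3 is ~{ratio:.0f}x larger than GPT-2 (~{orders:.1f} orders of magnitude)")
```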

Tags: attention, community, data, data science, family, few-shot, gpt, gpt-2, gpt-3, language, language models, large language models, machine learning, meta, meta-learning, nlp, release, scaling, science
