March 24, 2022, 11:40 a.m. | /u/juliensalinas


Hello all,

After a year of working extensively with GPT models (GPT-3, GPT-J, and GPT-NeoX), I think I now have a good sense of what these NLP models are capable of. It appears that many traditional NLP tasks can now be achieved with these large language models through few-shot learning (aka "prompting" or "prompt engineering").

NER is a very good candidate because, thanks to these models, it is possible to extract any type of entity without ever annotating and …
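To make the idea concrete, here is a minimal sketch of few-shot NER by prompting, assuming a local GPT-J checkpoint loaded through Hugging Face's `transformers` pipeline; the prompt template, the "###" separator, and the "company" entity type are illustrative choices, not the author's exact setup:

```python
# A minimal sketch of few-shot NER with GPT-J via Hugging Face transformers.
# The prompt format and the "company" entity type are illustrative
# assumptions, not the exact setup described in the post.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-j-6B")

# Few-shot prompt: a couple of solved examples teach the model the task,
# so no annotated training data or fine-tuning is needed.
prompt = """Extract the company names from each sentence.

Sentence: Tim Cook announced new products at Apple yesterday.
Companies: Apple
###
Sentence: Satya Nadella said Microsoft and GitHub will deepen their ties.
Companies: Microsoft, GitHub
###
Sentence: The deal between Renault and Nissan was renegotiated in Paris.
Companies:"""

result = generator(prompt, max_new_tokens=10, do_sample=False,
                   return_full_text=False)
print(result[0]["generated_text"].strip())  # expected: "Renault, Nissan"
```

The "###" separator is just a conventional stop marker between examples; swapping in a different entity type only requires changing the solved examples, which is what makes this approach attractive compared to annotating a training set.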

