March 11, 2024, 4:47 a.m. | Francesco Periti, Haim Dubossarsky, Nina Tahmasebi

cs.CL updates on arXiv.org

arXiv:2401.14040v2 Announce Type: replace
Abstract: In the universe of Natural Language Processing, Transformer-based language models like BERT and (Chat)GPT have emerged as lexical superheroes with great power to solve open research problems. In this paper, we specifically focus on the temporal problem of semantic change, and evaluate their ability to solve two diachronic extensions of the Word-in-Context (WiC) task: TempoWiC and HistoWiC. In particular, we investigate the potential of a novel, off-the-shelf technology like ChatGPT (and GPT) 3.5 compared to …
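The Word-in-Context task mentioned above is a binary decision: given one target word in two different contexts, judge whether it carries the same meaning in both. As a minimal sketch (not the authors' method), a common baseline compares the word's contextual embeddings from the two sentences by cosine similarity against a threshold; the vectors and the threshold below are toy assumptions standing in for real BERT-style embeddings.

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def same_sense(emb_a, emb_b, threshold=0.5):
    """WiC-style binary decision: True if the target word is judged
    to have the same meaning in both contexts (toy threshold)."""
    return cosine(emb_a, emb_b) >= threshold


# Toy vectors standing in for contextual embeddings of "mouse" in
# "the mouse ran across the floor" (animal) vs. "click the mouse" (device).
animal = [0.90, 0.10, 0.00]
animal2 = [0.85, 0.15, 0.05]
device = [0.10, 0.90, 0.20]

print(same_sense(animal, animal2))  # same sense -> True
print(same_sense(animal, device))   # shifted sense -> False
```

The diachronic variants (TempoWiC, HistoWiC) apply the same judgment to context pairs drawn from different time periods, so a sense shift over time shows up as a "different meaning" label.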
