June 2, 2022, 1:12 a.m. | Andrea Lekkas, Peter Schneider-Kamp, Isabelle Augenstein

cs.CL updates on arXiv.org

The effectiveness of a language model is influenced by its token
representations, which must encode contextual information and handle the same
word form carrying multiple meanings (polysemy). Currently, none of the
common language modelling architectures explicitly models polysemy. We propose a
language model that predicts not only the next word but also its sense in
context. We argue that this higher prediction granularity may be useful for end
tasks such as assistive writing, and allow for more a …
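The core idea is a dual-objective next-token model: alongside the usual softmax over the vocabulary, a second head predicts a sense label for the upcoming token. Below is a minimal PyTorch sketch of that idea, assuming a small causal transformer encoder and a fixed sense inventory; the class name `MultiSenseLM`, the sense inventory size, and the summed cross-entropy loss are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiSenseLM(nn.Module):
    """Toy causal LM with two heads: next-word logits and next-sense logits.
    Illustrative sketch only; the paper's actual architecture may differ."""

    def __init__(self, vocab_size, num_senses, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.word_head = nn.Linear(d_model, vocab_size)   # predicts the next word
        self.sense_head = nn.Linear(d_model, num_senses)  # predicts that word's sense

    def forward(self, tokens):
        # Causal mask: each position may only attend to earlier positions.
        n = tokens.size(1)
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        h = self.encoder(self.embed(tokens), mask=mask)
        return self.word_head(h), self.sense_head(h)

# Joint objective: sum of word-level and sense-level cross-entropies.
model = MultiSenseLM(vocab_size=10_000, num_senses=500)
tokens = torch.randint(0, 10_000, (2, 16))       # dummy input batch
next_words = torch.randint(0, 10_000, (2, 16))   # shifted word targets
next_senses = torch.randint(0, 500, (2, 16))     # hypothetical sense labels
word_logits, sense_logits = model(tokens)
loss = (F.cross_entropy(word_logits.reshape(-1, 10_000), next_words.reshape(-1))
        + F.cross_entropy(sense_logits.reshape(-1, 500), next_senses.reshape(-1)))
loss.backward()
```

Training on the summed loss pushes the shared representation to carry both word identity and sense information, so a single forward pass yields both distributions at inference time.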

