Aug. 6, 2022, 3:55 p.m. | /u/capitano_nemo

Natural Language Processing www.reddit.com

I'm looking into Word2Vec and I don't quite understand why the embeddings don't account for the context. In other words, why would the embedding of _bank_ be the same irrespective of whether we're talking about a financial institution or a river bank?

What puzzles me is the fact that both the skip-gram and CBOW approaches look at the context of the target word. Why doesn't this allow context information to be included in each embedding? A rough sketch of what I mean is below.
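Here's a minimal sketch using gensim's Word2Vec (4.x API) on a made-up toy corpus; it shows that the trained model stores a single lookup vector per word *type*, so "bank" comes out identical in both sentences:

```python
# Toy example: one vector per word type, regardless of sense/context.
from gensim.models import Word2Vec

sentences = [
    ["i", "deposited", "money", "at", "the", "bank"],     # financial sense
    ["we", "sat", "on", "the", "river", "bank"],          # river sense
]

# vector_size/window/min_count are illustrative; sg=1 selects skip-gram.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Both occurrences of "bank" map to the same row of the embedding matrix.
vec = model.wv["bank"]
print(vec.shape)            # (50,)
print("bank" in model.wv)   # True: one entry per word type, not per occurrence
```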

languagetechnology word2vec
