Aug. 6, 2022, 3:55 p.m. | /u/capitano_nemo

Natural Language Processing www.reddit.com

I'm looking into Word2Vec and I don't quite understand why the embeddings don't account for the context. In other words, why would the embedding of _bank_ be the same irrespective of whether we're talking about a financial institution or a river bank?
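To make the static-lookup behaviour concrete, here's a minimal sketch with gensim (4.x API assumed; the two-sentence corpus and all names are purely illustrative). The trained model can only be queried by word string, so there is no way to ask for a context-dependent vector:

```python
# Minimal sketch (gensim 4.x API assumed): Word2Vec stores ONE vector per
# vocabulary type in a lookup table keyed by the word string, so "bank"
# maps to a single row no matter which sense a sentence uses.
from gensim.models import Word2Vec

sentences = [
    ["she", "deposited", "cash", "at", "the", "bank"],  # financial sense
    ["they", "sat", "on", "the", "river", "bank"],      # river sense
]

# sg=1 trains skip-gram; sg=0 would train CBOW. Either way, every
# occurrence of "bank" updates the same row of the embedding matrix.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# The lookup takes only the word string -- the surrounding sentence
# cannot be passed in, so both senses share one fixed vector.
print(model.wv["bank"].shape)  # (50,): one vector for the type "bank"
print(len(model.wv))           # vocabulary size: one vector per word TYPE
```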

What puzzles me is that both the skip-gram and CBOW approaches look at the context of the target word. Why doesn't this allow context information to be included in each embedding?
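One way to see where the context goes: a toy numpy sketch of a single skip-gram update (negative sampling omitted for brevity; every name and number here is illustrative, not the reference implementation). Context words drive the gradient during training, but every occurrence of _bank_, whatever its sense, updates the same row of the embedding matrix, so the final vector is one compromise over all contexts rather than a per-context representation:

```python
# Toy sketch of one positive skip-gram update (negative samples omitted).
# All names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["bank", "money", "river", "deposit", "water"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 8

W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # target embeddings
W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context embeddings

def sgd_step(target, context, lr=0.1):
    """Push the target word's single row toward one context word's vector."""
    t, c = idx[target], idx[context]
    v_t, v_c = W_in[t].copy(), W_out[c].copy()          # pre-update values
    score = 1.0 / (1.0 + np.exp(-v_t @ v_c))            # sigmoid(v_t . v_c)
    grad = score - 1.0                                  # grad of -log sigmoid
    W_in[t] -= lr * grad * v_c
    W_out[c] -= lr * grad * v_t

# Both senses of "bank" pull on the SAME row W_in[idx["bank"]]:
sgd_step("bank", "money")  # financial context
sgd_step("bank", "river")  # river context
# Context shapes the vector during training, but nothing about the
# context survives into the lookup used at query time.
```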

languagetechnology word2vec
