Word embedding - contextualised vs word2vec [D]
April 17, 2024, 1:03 p.m. | /u/datashri
Machine Learning www.reddit.com
As far as I understand so far -
Contextualized word embeddings generated by BERT and other LLM-type models use the attention mechanism and take the context of the word into account, so the same word in different sentences can have different vectors.
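To make the contrast concrete, here is a minimal numpy sketch of how attention produces context-dependent vectors. It is not a real BERT layer (there are no learned query/key/value projections, no multiple heads, no layer norm — those are all omitted for brevity); it only shows the core idea that each output vector is a context-weighted mix of the input vectors, so the same word gets different vectors in different sentences.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy static embedding table: one fixed vector per word (word2vec-style).
vocab = ["the", "bank", "river", "money", "near"]
emb = {w: rng.normal(size=4) for w in vocab}

def self_attention(vectors):
    """Single-head self-attention with identity projections (a sketch,
    not a transformer layer): each output row is a softmax-weighted
    mix of all the input vectors in the sentence."""
    X = np.stack(vectors)                          # (seq_len, d)
    scores = X @ X.T / np.sqrt(X.shape[1])         # scaled dot-product
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X                             # contextualised vectors

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]
out1 = self_attention([emb[w] for w in s1])
out2 = self_attention([emb[w] for w in s2])

# The static embedding of "bank" is the same lookup in both sentences,
# but its contextualised vector differs because the context differs.
print(np.allclose(out1[2], out2[2]))  # False
```

The key point is that the output for "bank" depends on every other word in the sentence, whereas a word2vec table would return the identical vector both times.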
This is in contrast to the older approach of models like word2vec: embeddings generated by word2vec are not contextual.
However, looking closely at the CBOW and skip-gram models, it seems …
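For reference, the CBOW objective mentioned above can be sketched in a few lines of numpy: average the context word vectors, predict the centre word, and update by the cross-entropy gradient. This is an illustrative toy (full softmax instead of negative sampling, a tiny corpus, hand-picked hyperparameters), not production word2vec, but it shows why the resulting embeddings are static — each word's vector is a single row in a lookup table, independent of any sentence.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny corpus and vocabulary for a CBOW sketch (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8

W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (prediction) weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# CBOW: average the context word vectors, predict the centre word.
lr, window = 0.1, 1
for _ in range(200):
    for t in range(window, len(corpus) - window):
        ctx = [idx[corpus[t + o]] for o in range(-window, window + 1) if o != 0]
        h = W_in[ctx].mean(axis=0)          # averaged context vector
        p = softmax(h @ W_out)              # predicted centre-word distribution
        grad = p.copy()
        grad[idx[corpus[t]]] -= 1           # cross-entropy gradient wrt logits
        W_out -= lr * np.outer(h, grad)
        g_h = W_out @ grad                  # gradient back to the context average
        for c in ctx:
            W_in[c] -= lr * g_h / len(ctx)

# After training, W_in[idx["fox"]] is the single, context-independent
# vector for "fox" — the same lookup regardless of the sentence.
print(W_in[idx["fox"]].shape)  # (8,)
```

Skip-gram is the mirror image of this: the centre word predicts each context word instead of the other way round, but the learned table is equally static.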