Word embedding - contextualised vs word2vec [D]
April 17, 2024, 1:03 p.m. | /u/datashri
Machine Learning www.reddit.com
As far as I understand so far:
Contextualized word embeddings generated by BERT and other LLM-style models use the attention mechanism and take the context of the word into account. So the same word in different sentences can have different vectors.
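To make the distinction concrete, here is a toy NumPy sketch (not real BERT or word2vec — the vocabulary, dimensions, and single attention mix are made up for illustration): a static lookup returns the same vector for "bank" everywhere, while a self-attention-style mix of the sentence's vectors gives "bank" a different vector next to "river" than next to "money".

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy static vocabulary: one fixed vector per word (word2vec-style lookup).
vocab = {w: rng.normal(size=4) for w in ["the", "bank", "river", "money"]}

def static_embed(sentence):
    # word2vec-style: a word's vector ignores the sentence it appears in.
    return {w: vocab[w] for w in sentence}

def contextual_embed(sentence):
    # Toy self-attention: each word's output vector is a softmax-weighted
    # mix of all the word vectors in the sentence, so it depends on context.
    X = np.stack([vocab[w] for w in sentence])       # (n, d)
    scores = X @ X.T / np.sqrt(X.shape[1])           # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)    # softmax over context
    out = weights @ X                                # context-dependent mix
    return dict(zip(sentence, out))

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]

# Static: "bank" gets the identical vector in both sentences.
assert np.allclose(static_embed(s1)["bank"], static_embed(s2)["bank"])

# Contextual: "bank" gets a different vector depending on its neighbours.
assert not np.allclose(contextual_embed(s1)["bank"],
                       contextual_embed(s2)["bank"])
```

Real contextual models stack many such attention layers with learned projections, but the key property is already visible here: the output vector is a function of the whole sentence, not just the word.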
This is in contrast to the older approach of models like word2vec: embeddings generated by word2vec are not contextual.
However, looking closely at the CBOW and skip-gram models, it seems …