What are the 300-dimensional vectors in Google's word2vec model used for?
July 24, 2022, 10:34 p.m. | /u/cronicpainz
Natural Language Processing www.reddit.com
I'm new to NLP and just testing the waters with models, frameworks, techniques, and the related math.
I'm looking at this model:
https://code.google.com/archive/p/word2vec/
Can someone explain what the "300 embeddings" per word actually are? The model .bin file can be found here: https://drive.google.com/file/d/0B7XkCwpI5KDYNlNUTTlSS21pQmM/edit?usp=sharing
Here's the example for the word expected:
```
expected 0.06347656 0.265625 0.06982422 -0.008728027 -0.17578125 0.012573242 0.18457031 -0.037841797 0.20898438 0.087890625 -0.064941406 0.125 0.06689453 -0.18359375 -0.07080078 0.033935547 0.052978516 -0.14257812 0.12597656 0.08544922 -0.33398438 0.018310547 0.043701172 -0.15917969 -0.06542969 -0.12060547 -0.013122559 …
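For context: each line of that file maps one word to a learned 300-dimensional vector, and the individual numbers have no standalone meaning; what matters is the geometry, since words used in similar contexts end up with vectors pointing in similar directions, usually compared via cosine similarity. Here is a minimal sketch of that idea using made-up 3-dimensional toy vectors (the words and values are illustrative, not from the real model; to load the actual .bin you would typically use gensim's `KeyedVectors.load_word2vec_format(path, binary=True)`):

```python
import math

def cosine_similarity(u, v):
    # cos(theta) = dot(u, v) / (|u| * |v|)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional embeddings; the real model uses 300 dimensions per word.
embeddings = {
    "expected":    [0.063, 0.266, 0.070],
    "anticipated": [0.060, 0.250, 0.080],   # similar context -> similar direction
    "banana":      [-0.300, 0.010, -0.150], # unrelated word -> different direction
}

sim_close = cosine_similarity(embeddings["expected"], embeddings["anticipated"])
sim_far = cosine_similarity(embeddings["expected"], embeddings["banana"])
print(sim_close > sim_far)  # related words score higher
```

With the real 300-dimensional vectors the same comparison powers nearest-neighbor queries and the well-known analogy arithmetic (e.g. vector("king") - vector("man") + vector("woman") lands near vector("queen")).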