all AI news
Which type of word embedding (as in BERT, word2vec, GloVe, etc.) does spaCy use by default?
Nov. 6, 2022, 1:38 a.m. | /u/ethiopianboson
Natural Language Processing www.reddit.com
```python
import spacy
import numpy as np

nlp = spacy.load("en_core_web_md")

with open("data/wiki_us.txt", "r") as f:
    text = f.read()

doc = nlp(text)
sentence1 = list(doc.sents)[0]

# Find the 10 vocabulary entries whose vectors are closest to "dog".
your_word = "dog"
ms = nlp.vocab.vectors.most_similar(
    np.asarray([nlp.vocab.vectors[nlp.vocab.strings[your_word]]]), n=10)
words = [nlp.vocab.strings[w] for w in ms[0][0]]
distances = ms[2]
print(words)
```
# Similarity of tokens and …
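For context on the question: spaCy's `md`/`lg` pipelines ship static word vectors (word2vec/GloVe-style lookup tables, not contextual BERT-style embeddings), and `Token.similarity()` over those vectors is cosine similarity. A minimal sketch of that computation with made-up toy vectors (the 4-d values below are hypothetical stand-ins for real 300-d rows):

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    # Cosine of the angle between two vectors; this is what similarity()
    # reduces to when a pipeline uses static word vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical toy vectors standing in for real word-vector rows.
dog = np.array([0.9, 0.1, 0.4, 0.0])
cat = np.array([0.8, 0.2, 0.5, 0.1])
car = np.array([0.0, 0.9, 0.1, 0.8])

print(cosine_similarity(dog, cat))  # relatively high: similar toy vectors
print(cosine_similarity(dog, car))  # relatively low: dissimilar toy vectors
```

A score near 1.0 means near-identical direction, near 0.0 means unrelated; this is the same scale `most_similar` and `Token.similarity()` report.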