July 29, 2022, 5:55 a.m. | /u/Charming_Royal5996

Natural Language Processing www.reddit.com

I was trying to implement a simpler version of word2vec from scratch. While reading up on it, I realised that the word2vec implementation does not use any non-linearity! Why is it so good at creating embeddings without a non-linearity? And isn't non-linearity a crucial part of a neural network? How can a linear function capture complex relationships between words?
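To show what I mean, here is a rough sketch of the skip-gram forward pass as I understand it (plain NumPy, full softmax instead of negative sampling; names and sizes are just for illustration). The only operations are an embedding lookup, a dot product, and a softmax, so there is no activation function between the "layers":

```python
import numpy as np

# Illustrative sketch of a skip-gram forward pass, not a reference implementation.
vocab_size, embed_dim = 10, 5
rng = np.random.default_rng(0)

W_in = rng.normal(scale=0.1, size=(vocab_size, embed_dim))   # "input" (center word) embeddings
W_out = rng.normal(scale=0.1, size=(vocab_size, embed_dim))  # "output" (context word) embeddings

def forward(center_id):
    v = W_in[center_id]                # embedding lookup (equivalent to one-hot @ W_in, linear)
    scores = W_out @ v                 # dot product with every output vector (linear)
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()             # softmax over the vocabulary

probs = forward(center_id=3)
print(probs.shape, probs.sum())        # (10,) 1.0
```

Up to the softmax at the output, everything here is linear, which is exactly what confuses me.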

languagetechnology linear word2vec
