[D] A sentence-level transformer to improve memory for a token-level transformer?
March 28, 2024, 10:29 a.m. | /u/Alarming-Ad8154
Machine Learning www.reddit.com
You can embed sentences into vectors of length ~128 to ~2048, right? Then you can cluster those sentences and effectively project them into a lower-dimensional space.
I have often wondered whether you could take ~50,000 cardinal points in the embedding space (points placed so that the summed squared distance from each sentence in a representative corpus to its nearest point is minimal). You'd then map each sentence in a big corpus to the …
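A minimal sketch of that idea, reading the "cardinal points" as k-means centroids (which minimize exactly that summed squared distance to the nearest point). It assumes the sentence-transformers and scikit-learn packages; the model name, cluster count, and toy corpus are illustrative placeholders, not anything specified in the post:

```python
# Sketch: map sentences to "cardinal points" (k-means centroids) in embedding space.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import MiniBatchKMeans

corpus = [
    "The cat sat on the mat.",
    "Transformers process tokens in parallel.",
    "Sentence embeddings capture meaning in a fixed-size vector.",
    # ... a representative corpus would have millions of sentences
]

# 1. Embed each sentence into a fixed-length vector (~384 dims for this model).
encoder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = encoder.encode(corpus, normalize_embeddings=True)

# 2. Find k cardinal points: k-means centroids minimize the summed squared
#    distance from each sentence to its nearest centroid. The post suggests
#    ~50,000 points over a large corpus; a tiny k keeps this example runnable.
k = 2
kmeans = MiniBatchKMeans(n_clusters=k, random_state=0).fit(embeddings)

# 3. Map every sentence to its nearest cardinal point, yielding one coarse
#    sentence-level ID per sentence.
sentence_ids = kmeans.predict(embeddings)
print(sentence_ids)  # e.g. [0, 1, 1] -- one cluster ID per sentence
```

With ~50,000 centroids each cluster ID is effectively a token from a 50k vocabulary, so, as the title suggests, a second sentence-level transformer could attend over this much shorter ID sequence as long-range memory while the token-level model handles local detail.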