Contrastive learning unifies $t$-SNE and UMAP. (arXiv:2206.01816v1 [cs.LG])
June 7, 2022, 1:10 a.m. | Sebastian Damrich (1), Jan Niklas Böhm (2), Fred A. Hamprecht (1), Dmitry Kobak (2) ((1) IWR at Heidelberg University, (2) University of Tübingen)
cs.LG updates on arXiv.org
Neighbor embedding methods $t$-SNE and UMAP are the de facto standard for
visualizing high-dimensional datasets. They appear to use very different loss
functions with different motivations, and the exact relationship between them
has been unclear. Here we show that UMAP is effectively negative sampling
applied to the $t$-SNE loss function. We explain the difference between
negative sampling and noise-contrastive estimation (NCE), which has been used
to optimize $t$-SNE under the name NCVis. We prove that, unlike NCE, negative
sampling learns …
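To make the contrast between the two objectives concrete, here is a minimal numpy sketch of a negative-sampling loss (the UMAP-style objective) next to a noise-contrastive estimation loss (the NCVis-style objective) for a single positive pair and a batch of sampled negatives. The Cauchy kernel and the uniform noise probability are illustrative assumptions for this sketch, not the paper's exact parametrization.

```python
import numpy as np

def cauchy_sim(d2):
    # Cauchy (heavy-tailed) kernel used in t-SNE/UMAP-style embeddings:
    # similarity q = 1 / (1 + squared distance).
    return 1.0 / (1.0 + d2)

def neg_sampling_loss(d2_pos, d2_neg):
    # Negative sampling: attract the positive pair, push away each of
    # the m sampled negatives via log(1 - q).
    q_pos = cauchy_sim(d2_pos)
    q_neg = cauchy_sim(d2_neg)
    return -np.log(q_pos) - np.sum(np.log(1.0 - q_neg))

def nce_loss(d2_pos, d2_neg, noise_prob):
    # Noise-contrastive estimation: classify the positive pair against
    # m noise samples; the noise probability enters the normalizer.
    m = len(d2_neg)
    q_pos = cauchy_sim(d2_pos)
    q_neg = cauchy_sim(d2_neg)
    loss = -np.log(q_pos / (q_pos + m * noise_prob))
    loss -= np.sum(np.log(m * noise_prob / (q_neg + m * noise_prob)))
    return loss
```

Note the structural difference the abstract points at: negative sampling repels negatives with `log(1 - q)` and ignores the noise distribution, while NCE keeps the noise probability in the denominator, which is what gives it its consistency guarantees.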