April 18, 2024, 4:45 a.m. | Siyuan Li, Zicheng Liu, Zelin Zang, Di Wu, Zhiyuan Chen, Stan Z. Li

cs.CV updates on arXiv.org arxiv.org

arXiv:2110.14553v4 Announce Type: replace-cross
Abstract: Unsupervised representation learning (URL), which learns compact embeddings of high-dimensional data without supervision, has made remarkable progress recently. However, URL methods for different requirements have been developed independently, which limits the generalization of the algorithms and becomes especially prohibitive as the number of tasks grows. For example, dimension reduction methods such as t-SNE and UMAP optimize pair-wise data relationships by preserving the global geometric structure, while self-supervised learning methods such as SimCLR and BYOL focus on mining the local statistics of instances …
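To make the contrast the abstract draws more concrete, here is a rough, hypothetical sketch (not the paper's unified framework) of the two kinds of objectives it mentions: a pairwise-affinity-preserving loss in the spirit of t-SNE/UMAP, and a SimCLR-style instance-wise contrastive (NT-Xent) loss. The function names, toy data, and hyperparameters are illustrative assumptions only.

```python
# Illustrative only: two URL objectives side by side, not the paper's method.
import torch
import torch.nn.functional as F

def pairwise_preservation_loss(x_high, z_low, sigma=1.0):
    """t-SNE/UMAP-flavoured objective: make low-dimensional pairwise
    affinities match the high-dimensional ones (KL divergence here)."""
    def affinities(x):
        d = torch.cdist(x, x) ** 2                      # squared pairwise distances
        return torch.softmax(-d / (2 * sigma ** 2), dim=1)
    p = affinities(x_high)                              # affinities in input space
    q = affinities(z_low)                               # affinities in embedding space
    return F.kl_div(q.log(), p, reduction="batchmean")

def nt_xent_loss(z1, z2, tau=0.5):
    """SimCLR-style contrastive objective: two augmented views of each
    instance are positives; other samples in the batch are negatives."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d), unit-norm
    sim = z @ z.t() / tau                               # temperature-scaled cosine sims
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool), float("-inf"))  # drop self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])    # index of positive
    return F.cross_entropy(sim, targets)

# Toy usage with random tensors standing in for features and embeddings.
x = torch.randn(32, 128)                                # "high-dimensional" inputs
z = torch.randn(32, 2)                                  # 2-D embeddings (dimension reduction view)
v1, v2 = torch.randn(32, 64), torch.randn(32, 64)       # embeddings of two augmented views
print(pairwise_preservation_loss(x, z).item(), nt_xent_loss(v1, v2).item())
```

The first loss acts on relationships between samples (a global, geometry-preserving criterion), while the second acts per instance under augmentations (a local, discriminative criterion); the paper's point is that these have usually been developed separately.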
