April 2, 2024, 7:41 p.m. | Lecheng Zheng, Baoyu Jing, Zihao Li, Hanghang Tong, Jingrui He

cs.LG updates on arXiv.org arxiv.org

arXiv:2404.00225v1 Announce Type: new
Abstract: In the era of big data and Artificial Intelligence, an emerging paradigm is to utilize contrastive self-supervised learning to model large-scale heterogeneous data. Many existing foundation models benefit from the generalization capability of contrastive self-supervised learning by learning compact and high-quality representations without relying on any label information. Amidst the explosive advancements in foundation models across multiple domains, including natural language processing and computer vision, a thorough survey on heterogeneous contrastive learning for the foundation …
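The abstract's core idea, learning representations by contrasting different views of the same data without labels, is commonly instantiated with an InfoNCE-style objective. The sketch below is a generic illustration of that objective, not the survey's method; the two-view setup, the toy embeddings, and the temperature value are assumptions made for demonstration.

```python
# Minimal sketch of a contrastive (InfoNCE-style) loss, assuming a generic
# two-view setup; this is illustrative, not the paper's specific formulation.
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """Contrast two embedding views of the same batch: matched rows are
    positives, all other rows in the batch serve as negatives.
    No label information is needed."""
    z1 = F.normalize(z1, dim=1)                 # unit-norm embeddings
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # pairwise cosine similarities
    targets = torch.arange(z1.size(0))          # positive pairs lie on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: random embeddings standing in for two augmented views of 8 samples.
z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
print(info_nce_loss(z1, z2).item())
```

Because the positive pair is defined by data augmentation rather than annotation, the same objective applies across heterogeneous modalities, which is the generalization property the abstract attributes to contrastive self-supervised learning.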

