An Information Minimization Based Contrastive Learning Model for Unsupervised Sentence Embeddings Learning. (arXiv:2209.10951v1 [cs.CL])
Sept. 23, 2022, 1:15 a.m. | Shaobin Chen, Jie Zhou, Yuling Sun, Liang He
cs.CL updates on arXiv.org arxiv.org
Unsupervised sentence embedding learning has recently been dominated by
contrastive learning methods (e.g., SimCSE), which keep positive pairs similar
and push negative pairs apart. The contrast operation aims to retain as much
information as possible by maximizing the mutual information between positive
instances, which leads to redundant information in the sentence embeddings. To
address this problem, we present an information minimization based contrastive
learning (InforMin-CL) model that retains the useful information and discards the
redundant information by maximizing the mutual information …
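The contrast operation the abstract describes — keeping positive pairs similar while pushing in-batch negatives apart — is commonly implemented as an InfoNCE-style loss, as in SimCSE. Below is a minimal NumPy sketch of that loss (not the InforMin-CL objective itself, whose information-minimization term is truncated in this excerpt); the function name, batch layout, and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """SimCSE-style contrastive (InfoNCE) loss sketch.

    Row i of `positives` is the positive example for row i of
    `anchors`; every other row in the batch acts as a negative.
    This is an illustrative implementation, not InforMin-CL.
    """
    # L2-normalize so dot products become cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = (a @ p.T) / temperature               # (batch, batch) similarities
    sim = sim - sim.max(axis=1, keepdims=True)  # stabilize the softmax
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # The positive pair sits on the diagonal; maximizing its
    # log-probability pulls positives together and pushes the
    # rest of the batch apart.
    return -np.mean(np.diag(log_prob))
```

Maximizing the diagonal softmax probability is what the abstract calls "maximizing the mutual information between positive instances" (it is a lower bound on that mutual information); InforMin-CL's contribution is an additional term that discards information not needed for this objective.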