Instance Smoothed Contrastive Learning for Unsupervised Sentence Embedding. (arXiv:2305.07424v1 [cs.CL])
cs.LG updates on arXiv.org
Contrastive learning-based methods, such as unsup-SimCSE, have achieved
state-of-the-art (SOTA) performance in learning unsupervised sentence
embeddings. However, in previous studies, each embedding used for contrastive
learning is derived from only one sentence instance; we call these embeddings
instance-level embeddings. In other words, each embedding is treated as a
unique class of its own, which may hurt generalization performance. In this
study, we propose IS-CSE (instance smoothing contrastive sentence embedding) to
smooth the boundaries of embeddings in the feature space. Specifically, …
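For reference, the instance-level objective that unsup-SimCSE optimizes, and that instance smoothing aims to soften, can be sketched as an InfoNCE loss in which each sentence in a batch is its own class: its two dropout-noised encodings form the positive pair and every other sentence in the batch is a negative. The NumPy sketch below is a minimal illustration under that assumption, not the paper's implementation; the function name and the temperature value are placeholders:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """Instance-level contrastive (InfoNCE) loss over a batch.

    z1, z2: (batch, dim) arrays holding two encodings of the same
    sentences (e.g. two dropout-noised forward passes, as in unsup-SimCSE).
    Entry (i, i) of the similarity matrix is the positive pair; every
    other entry in row i is an in-batch negative.
    """
    # Cosine similarity: L2-normalize, then take dot products.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / temperature          # (batch, batch) logits

    # Cross-entropy with the diagonal as the target class,
    # computed via a numerically stable log-softmax.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Because each sentence is its own class here, the loss is minimized only when every embedding is pushed away from all of its batch neighbors, which is the behavior the abstract argues can hurt generalization and that IS-CSE smooths.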