May 15, 2023, 12:43 a.m. | Hongliang He, Junlei Zhang, Zhenzhong Lan, Yue Zhang

cs.LG updates on arXiv.org

Contrastive learning-based methods, such as unsup-SimCSE, have achieved
state-of-the-art (SOTA) performance in learning unsupervised sentence
embeddings. However, in previous studies, each embedding used for contrastive
learning is derived from only one sentence instance; we call these
instance-level embeddings. In other words, each embedding is treated as a
unique class of its own, which may hurt generalization performance. In this
study, we propose IS-CSE (instance smoothing contrastive sentence embedding) to
smooth the boundaries of embeddings in the feature space. Specifically, …
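The abstract is truncated before the method details, so the sketch below only illustrates the general setup it builds on: an unsup-SimCSE-style InfoNCE loss over instance-level embeddings, plus a hypothetical smooth_embeddings helper that mixes each embedding with similar entries from a memory buffer to form a group-level representation. The helper, its parameters (k, alpha), and the averaging rule are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def simcse_loss(z1, z2, temperature=0.05):
    # z1, z2: (batch, dim) embeddings of the same sentences under two
    # independent dropout masks; row i of z2 is the positive for row i of z1,
    # and every other row serves as an in-batch negative.
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                # (batch, batch) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)

def smooth_embeddings(z, memory, k=16, alpha=0.9):
    # Hypothetical smoothing step (illustrative only): mix each embedding with
    # the mean of its k most similar entries in a memory buffer of past
    # embeddings, so the representation reflects a group rather than a single instance.
    sim = F.normalize(z, dim=-1) @ F.normalize(memory, dim=-1).t()
    _, idx = sim.topk(k, dim=-1)                   # (batch, k) neighbour indices
    group = memory[idx].mean(dim=1)                # (batch, dim) group representation
    return alpha * z + (1.0 - alpha) * group

# Toy usage with random tensors standing in for encoder outputs.
batch, dim = 32, 768
z1, z2 = torch.randn(batch, dim), torch.randn(batch, dim)
memory = torch.randn(1024, dim)                    # buffer of previously computed embeddings
loss = simcse_loss(smooth_embeddings(z1, memory), z2)

In this sketch the smoothing is a fixed-weight average over nearest neighbours; the actual IS-CSE aggregation may differ, and the numbers (batch size, buffer size, k, alpha) are placeholders.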

