ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding. (arXiv:2109.04380v2 [cs.CL] UPDATED)
Sept. 13, 2022, 1:16 a.m. | Xing Wu, Chaochen Gao, Liangjun Zang, Jizhong Han, Zhongyuan Wang, Songlin Hu
cs.CL updates on arXiv.org
Contrastive learning has attracted much attention for learning unsupervised
sentence embeddings. The current state-of-the-art unsupervised method is
unsupervised SimCSE (unsup-SimCSE). Unsup-SimCSE uses dropout as a minimal
data augmentation method: it passes the same input sentence through a
pre-trained Transformer encoder (with dropout turned on) twice, and the two
resulting embeddings form a positive pair. As the length information of a
sentence will generally be encoded into the sentence embedding due to the
usage of position embeddings …
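The dropout-as-augmentation idea described above can be sketched in a few lines. This is a toy NumPy illustration, not the authors' implementation: the real unsup-SimCSE encoder is a pre-trained Transformer such as BERT, whereas here a fixed random linear projection stands in for it, and the dimensions and dropout rate are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the pre-trained Transformer encoder: a fixed random
# linear projection. (Assumption: unsup-SimCSE uses a BERT-style encoder;
# DIM_IN, DIM_OUT, and DROPOUT_P here are illustrative values.)
DIM_IN, DIM_OUT, DROPOUT_P = 64, 16, 0.1
W = rng.normal(size=(DIM_IN, DIM_OUT))

def encode(x):
    """One forward pass with dropout left *on*: randomly zero features."""
    mask = rng.random(x.shape) >= DROPOUT_P
    h = (x * mask / (1.0 - DROPOUT_P)) @ W   # inverted-dropout scaling
    return h / np.linalg.norm(h)             # unit-normalised embedding

# A single sentence, represented here by a toy feature vector.
sentence = rng.normal(size=DIM_IN)

# Passing the *same* sentence twice draws two different dropout masks,
# yielding two slightly different embeddings -- the positive pair.
z1, z2 = encode(sentence), encode(sentence)

cos_pos = float(z1 @ z2)   # high, since only the dropout noise differs
print(f"cosine(z1, z2) = {cos_pos:.3f}")
```

In training, such positive pairs would be scored against in-batch negatives with a contrastive (InfoNCE-style) loss; ESimCSE's contribution, per the abstract, lies in improving how these positive pairs are built.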