March 1, 2024, 5:49 a.m. | Bowen Zhang, Kehua Chang, Chunping Li

cs.CL updates on arXiv.org (arxiv.org)

arXiv:2309.11143v2 Announce Type: replace
Abstract: Unsupervised sentence representation learning aims to transform input sentences into fixed-length vectors enriched with intricate semantic information while obviating the reliance on labeled data. Recent progress in this field, propelled by contrastive learning and prompt engineering, has substantially narrowed the gap between unsupervised and supervised strategies. Nonetheless, the potential of Chain-of-Thought remains largely untapped along this trajectory. To unlock the latent capabilities of pre-trained models such as BERT, we propose a two-stage approach for …
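The abstract is cut off before the method details, but as a minimal sketch of the prompt-based ingredient it builds on: a PromptBERT-style template can turn a frozen BERT into a sentence encoder by reading off the hidden state at a [MASK] slot. The template string and model name below are illustrative assumptions, not the paper's actual two-stage configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative choice; the paper's exact backbone is not specified
# in the truncated abstract.
MODEL_NAME = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def sentence_embedding(sentence: str) -> torch.Tensor:
    """Encode one sentence into a fixed-length vector via a [MASK] prompt."""
    # Hypothetical PromptBERT-style template: BERT is nudged to summarize
    # the quoted sentence at the [MASK] position.
    text = f'This sentence : "{sentence}" means [MASK] .'
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    # The hidden state at the [MASK] position serves as the sentence vector.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()
    return hidden[0, mask_pos.item()]

vec = sentence_embedding("Unsupervised learning needs no labels.")
print(vec.shape)  # torch.Size([768])
```

In contrastive setups such as SimCSE, pairs of such vectors (e.g., two dropout-augmented views of the same sentence) would then be pulled together under an InfoNCE loss; where the Chain-of-Thought stage fits relative to this is exactly what the truncated abstract leaves open.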
