March 13, 2024, 4:47 a.m. | Jinta Weng, Yifan Deng, Donghao Li, Hao You, Yue Hu, Heyan Huang

cs.CL updates on arXiv.org arxiv.org

arXiv:2211.04118v3 Announce Type: replace
Abstract: The prompt has become an effective linguistic tool for utilizing pre-trained language models. However, in few-shot scenarios, subtle changes in prompt design can lead to widely different results, and prompt learning methods tend to overfit the limited samples. To alleviate this, we explore utilizing suitable contrastive samples and multi-degree contrastive learning methods to improve the robustness of the prompt representation. Therefore, the proposed Consprompt combined with the prompt encoding network, …
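The abstract centers on contrastive learning over prompt representations. As a rough illustration only, and not the authors' ConsPrompt implementation, the sketch below shows a supervised contrastive loss applied to prompt encodings, where samples sharing a label act as positives that are pulled together and others are pushed apart; the embedding dimension, temperature value, and batch layout are assumptions for the example.

```python
# Minimal sketch (assumed setup, not the paper's code) of a supervised
# contrastive loss over prompt/[MASK] representations from a PLM.
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """embeddings: (batch, dim) prompt representations; labels: (batch,) class ids."""
    z = F.normalize(embeddings, dim=-1)                # work in cosine-similarity space
    sim = z @ z.t() / temperature                      # (batch, batch) pairwise logits
    self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask

    sim = sim.masked_fill(self_mask, float("-inf"))    # exclude self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-probability of positive pairs per anchor, negated as the loss.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    pos_log_prob = torch.where(pos_mask, log_prob, torch.zeros_like(log_prob))
    return (-pos_log_prob.sum(dim=1) / pos_counts).mean()


# Toy usage: four prompt encodings from two classes.
emb = torch.randn(4, 768)
labels = torch.tensor([0, 0, 1, 1])
print(supervised_contrastive_loss(emb, labels))
```

In a few-shot prompt-learning setting, such a loss would typically be added to the usual masked-token classification objective so that the limited labeled samples also shape the geometry of the prompt representation space.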

