March 26, 2024, 4:51 a.m. | Zhenpeng Su, Xing Wu, Wei Zhou, Guangyuan Ma, Songlin Hu

cs.CL updates on arXiv.org arxiv.org

arXiv:2306.04357v4 Announce Type: replace
Abstract: Dialogue response selection aims to select an appropriate response from several candidates, given the history of user and system utterances. Most existing work focuses on post-training and fine-tuning tailored to cross-encoders; there are no post-training methods tailored to dense encoders for dialogue response selection. We argue that when a current language model such as BERT is employed as the dense encoder of a dense dialogue system, it separately encodes dialogue context and …
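The abstract contrasts cross-encoders with dense (bi-)encoders, which encode the dialogue context and each candidate response independently and score them by vector similarity. A minimal sketch of that bi-encoder selection step, with a toy hashed bag-of-words `embed()` standing in for a BERT-style dense encoder (all names here are hypothetical, not from the paper):

```python
import math

def embed(text: str, dim: int = 16) -> list[float]:
    # Toy stand-in for a dense encoder: deterministic hashed
    # bag-of-words vector, L2-normalized.
    vec = [0.0] * dim
    for tok in text.lower().split():
        vec[sum(ord(c) for c in tok) % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def select_response(context: str, candidates: list[str]) -> str:
    # Context and each candidate are encoded *separately*; relevance is
    # their dot product (cosine similarity, since vectors are normalized).
    ctx = embed(context)
    scores = [sum(c * r for c, r in zip(ctx, embed(cand)))
              for cand in candidates]
    return candidates[max(range(len(candidates)), key=scores.__getitem__)]
```

Because the two sides are encoded independently, candidate embeddings can be precomputed and indexed, which is what makes dense retrieval cheap at inference time compared to a cross-encoder that must re-encode every (context, candidate) pair.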

