Nov. 10, 2022, 2:15 a.m. | Yujie Xing, Jon Atle Gulla

cs.CL updates on arXiv.org

Despite the rapid progress of open-domain generation-based conversational
agents, most deployed systems treat dialogue contexts as single turns, while
systems dealing with multi-turn contexts are less studied. There is a lack of a
reliable metric for evaluating multi-turn modelling, as well as an effective
solution for improving it. In this paper, we focus on an essential component of
multi-turn generation-based conversational agents: context attention
distribution, i.e. how systems distribute their attention over the dialogue
context. To evaluate this component, we introduce …

