Sept. 13, 2022, 1:16 a.m. | Zhi Chen, Yuncong Liu, Lu Chen, Su Zhu, Mengyue Wu, Kai Yu

cs.CL updates on arXiv.org arxiv.org

This paper presents an ontology-aware pretrained language model (OPAL) for
end-to-end task-oriented dialogue (TOD). Unlike chit-chat dialogue models,
task-oriented dialogue models comprise at least two task-specific modules: a
dialogue state tracker (DST) and a response generator (RG). The dialogue state
consists of domain-slot-value triples, which are regarded as the user's
constraints for searching the domain-related databases. Large-scale
task-oriented dialogue data with annotated structured dialogue states are
usually inaccessible, which hinders the development of pretrained
language models for the …
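To make the DST notion concrete, here is a minimal sketch of a dialogue state as domain-slot-value triples used to filter a toy database. The domain, slot names, and records are illustrative assumptions, not taken from the paper or any benchmark.

```python
# Dialogue state: a list of (domain, slot, value) triples, i.e. the
# user's accumulated constraints so far in the conversation.
dialogue_state = [
    ("restaurant", "area", "centre"),
    ("restaurant", "food", "italian"),
]

# Toy domain-related database (illustrative records only).
database = {
    "restaurant": [
        {"name": "Pizza Hut", "area": "centre", "food": "italian"},
        {"name": "Nandos", "area": "south", "food": "portuguese"},
        {"name": "Clowns Cafe", "area": "centre", "food": "italian"},
    ]
}

def search(state, db):
    """Return, per domain, the records matching every slot-value constraint."""
    results = {}
    for domain in {d for d, _, _ in state}:
        constraints = {s: v for d, s, v in state if d == domain}
        results[domain] = [
            rec for rec in db.get(domain, [])
            if all(rec.get(s) == v for s, v in constraints.items())
        ]
    return results

matches = search(dialogue_state, database)
print([r["name"] for r in matches["restaurant"]])  # → ['Pizza Hut', 'Clowns Cafe']
```

In an end-to-end TOD system, the DST produces such triples from the dialogue history, and the RG conditions on the database results when generating the next system response.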

