Slot Induction via Pre-trained Language Model Probing and Multi-level Contrastive Learning. (arXiv:2308.04712v1 [cs.CL])
cs.LG updates on arXiv.org
Recent advanced methods in Natural Language Understanding for Task-oriented
Dialogue (TOD) Systems (e.g., intent detection and slot filling) require a
large amount of annotated data to achieve competitive performance. In reality,
token-level annotations (slot labels) are time-consuming and difficult to
acquire. In this work, we study the Slot Induction (SI) task whose objective is
to induce slot boundaries without explicit knowledge of token-level slot
annotations. We propose leveraging Unsupervised Pre-trained Language Model
(PLM) Probing and Contrastive Learning mechanism to exploit …
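The abstract is truncated, so the exact training objective is not shown here. As illustration only, a common choice for the contrastive component in representation-learning setups like this is an InfoNCE-style loss with in-batch negatives: each anchor embedding is pulled toward its positive (e.g., an augmented view of the same span) and pushed away from the other positives in the batch. A minimal NumPy sketch of that generic loss (not the paper's specific multi-level formulation; the function name and shapes are assumptions):

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic InfoNCE contrastive loss with in-batch negatives.

    anchors, positives: (batch, dim) arrays; positives[i] is the
    positive pair for anchors[i], all other rows serve as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The matching pair for anchor i sits on the diagonal
    return -np.mean(np.diag(log_probs))

# Example: embeddings whose positives are slightly perturbed copies
rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 16))
positives = anchors + 0.01 * rng.normal(size=(8, 16))
loss = info_nce_loss(anchors, positives)  # small, since pairs nearly align
```

Lowering `temperature` sharpens the softmax over negatives, which is the usual knob for how strongly mismatched pairs are penalized.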