On the Role of Pre-trained Language Models in Word Ordering: A Case Study with BART. (arXiv:2204.07367v1 [cs.CL])
April 18, 2022, 1:10 a.m. | Zebin Ou, Meishan Zhang, Yue Zhang
cs.CL updates on arXiv.org
Word ordering is a constrained language generation task that takes unordered words as input. Existing work uses linear models and neural networks for the task, yet pre-trained language models have not been studied for word ordering, let alone why they help. We take BART as an instance and show its effectiveness on the task. To explain why BART helps word ordering, we extend the analysis with probing and empirically identify that syntactic dependency knowledge in BART is a reliable explanation. We also …
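Below is a minimal sketch of how word ordering can be framed as sequence-to-sequence generation with BART, assuming the Hugging Face transformers library. The model name, shuffled example, and decoding settings are illustrative; this is not the authors' pipeline, and a model fine-tuned on shuffled-input/ordered-output pairs would be needed to produce good orderings.

# Sketch: word ordering as seq2seq generation with BART (assumes `transformers`).
# Illustrates the task setup only; a pre-trained (not fine-tuned) checkpoint
# will not reliably reorder the words.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Unordered bag of words as input; the target is the original sentence.
shuffled = "ordering task word a is generation language constrained"
inputs = tokenizer(shuffled, return_tensors="pt")

# Beam search decodes a candidate ordering of the input words.
output_ids = model.generate(**inputs, num_beams=5, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))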