Oct. 31, 2022, 1:15 a.m. | Zebin Ou, Meishan Zhang, Yue Zhang

cs.CL updates on arXiv.org

Word ordering is a constrained language generation task that takes unordered words as input. Existing work uses linear models and neural networks for the task, yet pre-trained language models have not been studied in word ordering, let alone why they help. We use BART as an instance and show its effectiveness on the task. To explain why BART helps word ordering, we extend the analysis with probing and empirically identify that syntactic dependency knowledge in BART is a reliable explanation. We also …
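The abstract frames word ordering as conditional generation from a bag of words. Below is a minimal sketch of that framing with a pre-trained BART model; the shuffle-and-reconstruct setup, the facebook/bart-base checkpoint, and the decoding settings are illustrative assumptions, not the paper's reported method.

```python
# Sketch: word ordering as seq2seq generation with BART.
# Assumptions (not from the paper): the facebook/bart-base checkpoint,
# the shuffle-and-reconstruct framing, and beam-search decoding.
import random

from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

sentence = "the cat sat on the mat"
words = sentence.split()
random.shuffle(words)            # unordered bag of words as the source
source = " ".join(words)

inputs = tokenizer(source, return_tensors="pt")
# In practice the model would first be fine-tuned on
# (shuffled words, original sentence) pairs; an off-the-shelf
# checkpoint will not reliably reorder words.
output_ids = model.generate(**inputs, num_beams=4, max_length=32)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0])
```

Fine-tuning on shuffled/original pairs would give the model the reordering signal the task needs; the truncated abstract does not specify the authors' exact training setup, so the above is only a plausible instantiation.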

