Oct. 31, 2022, 1:15 a.m. | Zebin Ou, Meishan Zhang, Yue Zhang

cs.CL updates on arXiv.org

Word ordering is a constrained language generation task that takes unordered
words as input. Existing work applies linear models and neural networks to the
task, yet pre-trained language models have not been studied for word ordering,
let alone why they help. We use BART as an instance and show its effectiveness
on the task. To explain why BART helps word ordering, we extend the analysis
with probing and empirically identify that syntactic dependency knowledge in
BART is a reliable explanation. We also …
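Since the abstract frames word ordering as sequence generation from an unordered bag of words, here is a minimal sketch of how one might cast the task as seq2seq fine-tuning with BART via Hugging Face transformers. The checkpoint name, input formatting, and training-pair construction are illustrative assumptions, not the paper's exact setup, and plain beam search does not enforce the constraint that the output be a permutation of the input words.

```python
# Sketch: word ordering as seq2seq with BART (assumed setup, not the
# paper's exact configuration).
import random
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

sentence = "the cat sat on the mat"
words = sentence.split()
random.shuffle(words)
source = " ".join(words)  # unordered words as input

# Training pair: shuffled words -> original sentence.
batch = tokenizer(source, return_tensors="pt")
labels = tokenizer(sentence, return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss  # fine-tuning objective

# After fine-tuning, generation should recover an ordering of the
# input words (the permutation constraint is not enforced here).
ids = model.generate(**batch, max_length=32, num_beams=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```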
