On the Role of Bidirectionality in Language Model Pre-Training. (arXiv:2205.11726v1 [cs.CL])
May 25, 2022, 1:11 a.m. | Mikel Artetxe, Jingfei Du, Naman Goyal, Luke Zettlemoyer, Ves Stoyanov
cs.CL updates on arXiv.org
Prior work on language model pre-training has explored different
architectures and learning objectives, but differences in data, hyperparameters
and evaluation make a principled comparison difficult. In this work, we focus
on bidirectionality as a key factor that differentiates existing approaches,
and present a comprehensive study of its role in next token prediction, text
infilling, zero-shot priming and fine-tuning. We propose a new framework that
generalizes prior approaches, including fully unidirectional models like GPT,
fully bidirectional models like BERT, and hybrid …
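
The abstract contrasts fully unidirectional (GPT-style) and fully bidirectional (BERT-style) models. The paper's actual framework is not reproduced here, but as a rough sketch of what that distinction means at the attention level, the snippet below builds the two kinds of self-attention masks; the function name and boolean-mask convention are illustrative assumptions, not taken from the paper.

```python
import torch

def self_attention_mask(seq_len: int, bidirectional: bool) -> torch.Tensor:
    """Boolean mask where entry [i, j] = True means token i may attend to token j.

    bidirectional=False yields the causal (lower-triangular) mask of
    unidirectional models like GPT; bidirectional=True lets every token
    attend to the full context, as in BERT-style encoders.
    """
    full = torch.ones(seq_len, seq_len)
    mask = full if bidirectional else torch.tril(full)
    return mask.bool()

# With seq_len = 4, the causal mask stops token 1 from attending to
# tokens 2 and 3; the bidirectional mask imposes no such restriction.
print(self_attention_mask(4, bidirectional=False))
print(self_attention_mask(4, bidirectional=True))
```

Hybrid approaches of the kind the abstract alludes to can be thought of as sitting between these two extremes, which is what makes bidirectionality a natural axis for the paper's comparison.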