Web: http://arxiv.org/abs/2206.11309

June 24, 2022, 1:12 a.m. | Baolin Peng, Michel Galley, Pengcheng He, Chris Brockett, Lars Liden, Elnaz Nouri, Zhou Yu, Bill Dolan, Jianfeng Gao

cs.CL updates on arXiv.org

We introduce GODEL (Grounded Open Dialogue Language Model), a large
pre-trained language model for dialog. In contrast with earlier models such as
DialoGPT, GODEL leverages a new phase of grounded pre-training designed to
better support adapting GODEL to a wide range of downstream dialog tasks that
require information external to the current conversation (e.g., a database or
document) to produce good responses. Experiments against an array of benchmarks
that encompass task-oriented dialog, conversational QA, and grounded
open-domain dialog show that …

Tags: arxiv, pre-training, scale, training
