Web: http://arxiv.org/abs/2205.01841

May 5, 2022, 1:11 a.m. | Jinhao Jiang, Kun Zhou, Wayne Xin Zhao, Ji-Rong Wen

cs.CL updates on arXiv.org arxiv.org

Commonsense reasoning in natural language is a desired capability of artificially
intelligent systems. To solve complex commonsense reasoning tasks, a typical
solution is to enhance pre-trained language models (PTMs) with a
knowledge-aware graph neural network (GNN) encoder that models a commonsense
knowledge graph (CSKG). Despite their effectiveness, these approaches rely on
heavyweight architectures and cannot clearly explain how external knowledge
resources improve the reasoning capacity of PTMs. Considering this issue, we
conduct a deep empirical analysis, and find that it is indeed …
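The PTM-plus-GNN pipeline the abstract describes can be sketched in miniature. This is not the paper's implementation (the abstract is truncated and gives no details): the toy feature vectors, the mean-aggregation GNN layer, and the concatenation-style fusion are all illustrative assumptions standing in for real pre-trained embeddings and a real CSKG subgraph.

```python
# Hedged sketch, NOT the paper's method: a generic "PTM sentence vector +
# GNN over a CSKG subgraph" fusion, with toy 2-d features standing in for
# real pre-trained embeddings. All names and numbers are illustrative.

from typing import Dict, List, Tuple


def gnn_layer(node_feats: Dict[str, List[float]],
              edges: List[Tuple[str, str]]) -> Dict[str, List[float]]:
    """One round of mean-aggregation message passing over a small subgraph."""
    dim = len(next(iter(node_feats.values())))
    out = {}
    for node, feat in node_feats.items():
        neigh = [node_feats[dst] for (src, dst) in edges if src == node]
        neigh.append(feat)  # include a self-loop
        out[node] = [sum(vec[i] for vec in neigh) / len(neigh)
                     for i in range(dim)]
    return out


def fuse(ptm_vec: List[float],
         graph_vecs: Dict[str, List[float]]) -> List[float]:
    """Concatenate the PTM sentence vector with a mean-pooled graph vector."""
    dim = len(ptm_vec)
    pooled = [sum(vec[i] for vec in graph_vecs.values()) / len(graph_vecs)
              for i in range(dim)]
    return ptm_vec + pooled


# Toy CSKG subgraph: 3 concepts, 2 relations, 2-d features per node.
feats = {"bird": [1.0, 0.0], "fly": [0.0, 1.0], "wing": [1.0, 1.0]}
edges = [("bird", "fly"), ("bird", "wing")]
fused = fuse([0.5, 0.5], gnn_layer(feats, edges))
print(fused)  # 4-d joint representation, fed to a downstream classifier
```

In a real system the sentence vector would come from a PTM encoder, the subgraph would be retrieved from a CSKG such as ConceptNet, and the fusion would feed a task-specific answer scorer; the heavyweight variants of exactly this pattern are what the paper analyzes.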

