May 5, 2022, 1:11 a.m. | Jinhao Jiang, Kun Zhou, Wayne Xin Zhao, Ji-Rong Wen

cs.CL updates on arXiv.org arxiv.org

Commonsense reasoning in natural language is a desired ability of artificial
intelligence systems. To solve complex commonsense reasoning tasks, a typical
solution is to enhance pre-trained language models~(PTMs) with a
knowledge-aware graph neural network~(GNN) encoder that models a commonsense
knowledge graph~(CSKG). Despite their effectiveness, these approaches rely
on heavy architectures and cannot clearly explain how external knowledge
resources improve the reasoning capacity of PTMs. To address this issue, we
conduct a deep empirical analysis, and find that it is indeed …
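The PTM-plus-GNN setup the abstract describes can be sketched in a few lines. This is a minimal, illustrative fusion, not the authors' model: the node features, the mean-aggregation message passing, and the concatenation fusion are all assumptions for the sake of the example.

```python
# Illustrative sketch of fusing a PTM sentence embedding with a GNN
# encoding of a retrieved CSKG subgraph (not the paper's exact model).

from typing import Dict, List, Tuple


def gnn_layer(feats: Dict[str, List[float]],
              edges: List[Tuple[str, str]]) -> Dict[str, List[float]]:
    """One message-passing step: each node averages its own feature
    vector with those of its neighbours (mean aggregation, self-loop)."""
    neighbours = {n: [n] for n in feats}
    for u, v in edges:
        neighbours[u].append(v)
        neighbours[v].append(u)
    dim = len(next(iter(feats.values())))
    return {node: [sum(feats[m][i] for m in nbrs) / len(nbrs)
                   for i in range(dim)]
            for node, nbrs in neighbours.items()}


def fuse(ptm_embedding: List[float],
         feats: Dict[str, List[float]],
         edges: List[Tuple[str, str]]) -> List[float]:
    """Encode the subgraph with one GNN layer, mean-pool the node
    states, and concatenate with the PTM's sentence embedding."""
    h = gnn_layer(feats, edges)
    dim = len(next(iter(h.values())))
    pooled = [sum(v[i] for v in h.values()) / len(h) for i in range(dim)]
    return ptm_embedding + pooled  # fused vector fed to a classifier head


# Toy CSKG subgraph around the question concept "bird" (hypothetical data)
feats = {"bird": [1.0, 0.0], "fly": [0.0, 1.0], "wings": [1.0, 1.0]}
edges = [("bird", "fly"), ("bird", "wings")]
fused = fuse([0.5, 0.5], feats, edges)
print(len(fused))  # 4 = PTM dim + pooled GNN dim
```

In a real system the PTM embedding would come from a transformer encoder and the GNN would run several learned layers over a subgraph retrieved from a CSKG such as ConceptNet; the sketch only shows how the two representations are combined.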
