May 16, 2022, 1:11 a.m. | Ehsan Qasemi, Filip Ilievski, Muhao Chen, Pedro Szekely

cs.CL updates on arXiv.org arxiv.org

Humans can seamlessly reason with circumstantial preconditions of commonsense
knowledge. We understand that a glass is used for drinking water, unless the
glass is broken or the water is toxic. Despite the impressive performance of
state-of-the-art (SOTA) language models (LMs) on inferring commonsense
knowledge, it is unclear whether they understand such circumstantial
preconditions. To address this gap, we propose a novel challenge of reasoning
with circumstantial preconditions. We collect a dataset, called PaCo,
consisting of 12.4 thousand preconditions of commonsense statements expressed …
