Web: http://arxiv.org/abs/2104.08741

Jan. 31, 2022, 2:10 a.m. | Keshav Kolluru, Mayank Singh Chauhan, Yatin Nandwani, Parag Singla, Mausam

cs.CL updates on arXiv.org

Pre-trained language models (LMs) like BERT have been shown to store factual
knowledge about the world. This knowledge can be used to augment the
information present in Knowledge Bases, which tend to be incomplete. However,
prior attempts at using BERT for the task of Knowledge Base Completion (KBC)
resulted in performance worse than embedding-based techniques that rely only on
the graph structure. In this work, we develop a novel model, Cross-Entity Aware
Reranker (CEAR), that uses BERT to re-rank the output …
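To make the re-ranking idea concrete, here is a minimal sketch (not the authors' CEAR implementation) of scoring candidate tail entities with BERT: candidates produced by an embedding-based KBC model are verbalized as text and passed through a sequence-classification head, whose scores determine the new ordering. The verbalization scheme, the rerank function, and the example triples are illustrative assumptions; the scoring head would also need to be fine-tuned on KBC triples before the scores are meaningful.

# Minimal sketch of LM-based re-ranking for KBC. Assumes candidate tail
# entities already come from an embedding-based model; all names below are
# illustrative, not CEAR's actual interface.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Single-logit head used as a plausibility scorer; it must be fine-tuned on
# (head, relation, tail) triples before its scores are informative.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=1)
model.eval()

def rerank(head, relation, candidates):
    # Verbalize each candidate triple as plain text (a simplistic scheme).
    texts = [f"{head} {relation} {c}" for c in candidates]
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        scores = model(**enc).logits.squeeze(-1)  # one score per candidate
    order = torch.argsort(scores, descending=True)
    return [candidates[i] for i in order]

# Hypothetical usage: re-order tail candidates for a query triple.
print(rerank("Barack Obama", "born in", ["Honolulu", "Chicago", "Nairobi"]))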
