Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models. (arXiv:2208.12539v2 [cs.CL] UPDATED)
cs.CL updates on arXiv.org
We present a system for knowledge graph population with language models,
evaluated on the Knowledge Base Construction from Pre-trained Language Models
(LM-KBC) challenge at ISWC 2022. Our system combines task-specific pre-training,
which improves the LM's representation of the masked object tokens, with prompt
decomposition for progressive generation of candidate objects, among other
methods for higher-quality retrieval. Our system won track 1 of the LM-KBC
challenge, the track based on the BERT LM, achieving a 55.0% F1 score on the hidden test set …
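The core recipe, a masked LM filling the object slot of a relation prompt, plus progressive re-querying for relations with several objects, can be sketched with the Hugging Face fill-mask pipeline. The templates, score threshold, and re-querying loop below are illustrative assumptions, not the authors' exact system, which additionally applies the task-specific pre-training step upstream:

```python
# A minimal sketch, assuming the Hugging Face `transformers` fill-mask
# pipeline with bert-base-cased. The prompt templates, threshold, and
# re-querying loop are illustrative, not the paper's exact pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-cased")
MASK = fill_mask.tokenizer.mask_token  # "[MASK]" for BERT


def candidate_objects(subject, template, top_k=5):
    """Rank candidate objects for one (subject, relation) prompt."""
    prompt = template.format(subject=subject, mask=MASK)
    return [(pred["token_str"].strip(), pred["score"])
            for pred in fill_mask(prompt, top_k=top_k)]


def progressive_candidates(subject, template, max_objects=5, threshold=0.05):
    """Grow a multi-object answer set one object at a time: each round
    splices the objects found so far back into the prompt before the mask,
    a simple stand-in for the paper's prompt decomposition."""
    found = []
    for _ in range(max_objects):
        context = ", ".join(found) + ", " if found else ""
        prompt = template.format(subject=subject, mask=context + MASK)
        best = fill_mask(prompt, top_k=1)[0]
        obj = best["token_str"].strip()
        if best["score"] < threshold or obj in found:
            break  # stop when the LM is unsure or starts repeating
        found.append(obj)
    return found


# Single-object query, e.g. a capital-city relation
print(candidate_objects("France", "The capital of {subject} is {mask}."))
# Multi-object query, e.g. a country-borders relation
print(progressive_candidates("France", "{subject} shares a land border with {mask}."))
```

In this sketch, the re-querying loop conditions each new candidate on the objects already retrieved; the paper's task-specific pre-training would presumably precede such querying, tuning the masked-LM head so that object tokens are better represented.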