Feb. 20, 2024, 5:52 a.m. | Kai Wang, Yuwei Xu, Zhiyong Wu, Siqiang Luo

cs.CL updates on arXiv.org arxiv.org

arXiv:2402.11804v1 Announce Type: cross
Abstract: Knowledge Graph (KG) inductive reasoning, which aims to infer missing facts from new KGs not seen during training, has been widely adopted in various applications. One critical challenge of KG inductive reasoning is handling low-resource scenarios, in which both textual and structural information are scarce. In this paper, we attempt to address this challenge with Large Language Models (LLMs). In particular, we utilize state-of-the-art LLMs to generate a graph-structural prompt to enhance the pre-trained …
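To make the idea of a "graph-structural prompt" concrete, here is a minimal sketch of one plausible ingredient: serializing a KG subgraph around a query entity into text that an LLM could consume. Everything here is an illustrative assumption (the function name, the neighborhood-expansion strategy, and the sample triples), not the paper's actual method.

```python
def build_graph_prompt(triples, query_entity, max_hops=1):
    """Collect triples within max_hops of query_entity and render them
    as a textual prompt. A hypothetical sketch, not the paper's pipeline."""
    frontier = {query_entity}
    selected = []
    for _ in range(max_hops):
        next_frontier = set()
        for h, r, t in triples:
            # Keep any triple touching the current frontier.
            if h in frontier or t in frontier:
                if (h, r, t) not in selected:
                    selected.append((h, r, t))
                next_frontier.update([h, t])
        frontier = next_frontier
    lines = [f"({h}) -[{r}]-> ({t})" for h, r, t in selected]
    return ("Known facts:\n" + "\n".join(lines) +
            f"\nQuestion: what facts are missing about ({query_entity})?")

# Toy knowledge graph (illustrative data).
triples = [
    ("Alice", "works_at", "AcmeCorp"),
    ("AcmeCorp", "located_in", "Austin"),
    ("Bob", "knows", "Alice"),
]
prompt = build_graph_prompt(triples, "Alice")
```

With `max_hops=1`, only triples directly touching the query entity are kept; the resulting text could then be passed to an LLM as structural context.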

