Feb. 7, 2024, 5:45 a.m. | Baolong Bi Shenghua Liu Yiwei Wang Lingrui Mei Xueqi Cheng

cs.LG updates on arXiv.org

Exploring the application of large language models (LLMs) to graph learning is an emerging endeavor. However, the vast amount of information inherent in large graphs poses significant challenges to this process. This work focuses on the link prediction task and introduces $\textbf{LPNL}$ (Link Prediction via Natural Language), a framework based on large language models designed for scalable link prediction on large-scale heterogeneous graphs. We design novel prompts for link prediction that articulate graph details in natural language. We propose a …
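To make the idea of "articulating graph details in natural language" concrete, here is a minimal sketch of how a link-prediction prompt might be assembled from a heterogeneous graph neighborhood. The node schema, helper names, and prompt wording are illustrative assumptions for this sketch, not LPNL's actual prompt format.

```python
# Hypothetical sketch: verbalize a heterogeneous graph neighborhood into a
# natural-language link-prediction prompt for an LLM. The schema and wording
# are assumptions, not the paper's actual design.

def describe_node(node_id, graph):
    """Render one node and its typed neighbors as a sentence."""
    node = graph[node_id]
    neighbor_parts = [
        f"{rel} '{graph[n]['name']}' ({graph[n]['type']})"
        for rel, n in node["edges"]
    ]
    return (f"Node '{node['name']}' is a {node['type']} connected to: "
            + "; ".join(neighbor_parts) + ".")

def link_prediction_prompt(source_id, candidate_ids, graph):
    """Ask an LLM which candidate node is most likely linked to the source."""
    lines = [describe_node(source_id, graph)]
    lines += [describe_node(c, graph) for c in candidate_ids]
    lines.append(
        "Question: which of the candidate nodes is most likely to be "
        f"linked to '{graph[source_id]['name']}'? Answer with its name."
    )
    return "\n".join(lines)

# Toy heterogeneous graph: author and paper nodes with typed edges.
graph = {
    "a1": {"name": "Alice", "type": "author", "edges": [("wrote", "p1")]},
    "p1": {"name": "Graph LLMs", "type": "paper", "edges": [("cites", "p2")]},
    "p2": {"name": "Link Prediction Survey", "type": "paper", "edges": []},
    "a2": {"name": "Bob", "type": "author", "edges": [("wrote", "p2")]},
}

print(link_prediction_prompt("a1", ["p2", "a2"], graph))
```

The prompt produced this way could then be passed to any LLM; scaling such verbalization to large graphs (e.g., by sampling or pruning neighborhoods) is precisely the challenge the abstract highlights.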

