Linkless Link Prediction via Relational Distillation. (arXiv:2210.05801v1 [cs.LG])
Oct. 13, 2022, 1:11 a.m. | Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh Chawla, Neil Shah, Tong Zhao
cs.LG updates on arXiv.org arxiv.org
Graph Neural Networks (GNNs) are widely used on graph data and have shown
exceptional performance on the task of link prediction. Despite their
effectiveness, GNNs often suffer from high inference latency in practical
deployments because of their non-trivial dependency on neighborhood data. To
address this issue, researchers have proposed knowledge distillation (KD)
methods that transfer knowledge from teacher GNNs to student MLPs, which are
known to be efficient even at industrial scale, and have shown promising results on …
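The core idea can be sketched in a few lines: a frozen teacher produces pairwise link scores, and a graph-free student is trained to reproduce those scores from node features alone, so inference needs no neighborhood fetching. This is a minimal illustrative sketch, not the paper's method: the random "teacher" embeddings stand in for a trained GNN's output, the student is a single linear layer rather than a real MLP, and all shapes and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy shapes: raw node features and frozen "teacher" embeddings
# (standing in for the output of a trained GNN).
n_nodes, d_feat, d_emb = 30, 8, 4
X = rng.normal(size=(n_nodes, d_feat))
teacher_emb = rng.normal(size=(n_nodes, d_emb))

# The teacher's pairwise link scores: inner products of node embeddings.
T = teacher_emb @ teacher_emb.T

def pairwise_loss(W):
    """Mean squared gap between student and teacher pairwise link scores."""
    S = X @ W
    return float(((S @ S.T - T) ** 2).mean())

# Student: a single linear layer (the simplest stand-in for an MLP) that sees
# only node features, never the graph structure.
W = 0.1 * rng.normal(size=(d_feat, d_emb))
loss_start = pairwise_loss(W)

# Plain gradient descent on the pairwise distillation objective.
lr = 1e-3
for _ in range(500):
    S = X @ W
    D = S @ S.T - T                          # residual matrix (symmetric here)
    grad_W = X.T @ (4.0 * D @ S / D.size)    # exact gradient of the mean loss
    W -= lr * grad_W

loss_end = pairwise_loss(W)
print(f"pairwise score gap: {loss_start:.3f} -> {loss_end:.3f}")
```

The point of matching pairwise scores rather than raw embeddings is that link prediction only depends on relations between nodes, so the student is free to use any embedding space whose inner products agree with the teacher's.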