Web: http://arxiv.org/abs/2205.03811

June 17, 2022, 1:11 a.m. | Yuanxin Zhuang, Lingjuan Lyu, Chuan Shi, Carl Yang, Lichao Sun

cs.LG updates on arXiv.org

Graph neural networks (GNNs) have been widely used to model graph-structured
data, owing to their impressive performance across a wide range of practical
applications. Recently, knowledge distillation (KD) for GNNs has enabled
remarkable progress in graph model compression and knowledge transfer.
However, most existing KD methods require a large volume of real data, which
is not readily available in practice, and this may preclude their
applicability in scenarios where the teacher model is trained on rare or hard …
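For context, the conventional (data-dependent) KD objective that the abstract contrasts with pairs a soft-label term against the teacher's outputs with the usual supervised loss. The sketch below is a minimal illustration in PyTorch, not the paper's method; the `kd_loss` helper, `temperature`, and `alpha` names are illustrative, and a data-free variant would have to replace the real graph inputs with synthesized ones (not shown here).

```python
# Minimal sketch of standard knowledge distillation for GNN logits (assumes PyTorch).
# This is the conventional, data-dependent objective, not the paper's data-free method.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Combine soft-label distillation with the usual cross-entropy term."""
    # Soft targets: KL divergence between temperature-scaled class distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard supervised loss on the real node labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```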

Tags: arxiv, cs.LG, data-free, distillation, graph neural networks, knowledge
