Decouple Graph Neural Networks: Train Multiple Simple GNNs Simultaneously Instead of One
April 23, 2024, 4:43 a.m. | Hongyuan Zhang, Yanan Zhu, Xuelong Li
cs.LG updates on arXiv.org
Abstract: Graph neural networks (GNNs) suffer from severe inefficiency, mainly caused by the exponential growth of node dependency as the number of layers increases. This severely limits the applicability of stochastic optimization algorithms, so training a GNN is usually time-consuming. To address this problem, we propose to decouple a multi-layer GNN into multiple simple modules for more efficient training, comprising classical forward training (FT) and a designed backward training (BT). Under the …
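To make the decoupling idea concrete, here is a minimal, hypothetical sketch of greedy layer-wise GNN training in NumPy: each single-layer GCN module is optimized on its own with a temporary linear readout head, then its output becomes the next module's input. The graph, features, labels, and hyperparameters are all made up for illustration, and this is only a generic layer-wise scheme, not the paper's exact FT/BT procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic graph: 8 nodes in two loosely connected clusters (made-up data).
A = np.zeros((8, 8))
for i, j in [(0, 1), (0, 2), (1, 3), (2, 3), (4, 5), (4, 6), (5, 7), (6, 7), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

# Symmetric normalization with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}
A_tilde = A + np.eye(8)
deg = A_tilde.sum(1)
A_hat = A_tilde / np.sqrt(np.outer(deg, deg))

X = rng.normal(size=(8, 5))          # node features
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
Y = np.eye(2)[y]                     # one-hot labels

def train_module(H, Y, hid=8, steps=200, lr=0.5):
    """Train ONE single-layer GCN module in isolation with its own readout head."""
    n, d_in = H.shape
    W = rng.normal(scale=0.1, size=(d_in, hid))   # module weights
    R = rng.normal(scale=0.1, size=(hid, Y.shape[1]))  # throwaway linear head
    AH = A_hat @ H                   # one-hop propagation, fixed during training
    for _ in range(steps):
        Z = AH @ W
        Hn = np.maximum(Z, 0)        # ReLU
        logits = Hn @ R
        P = np.exp(logits - logits.max(1, keepdims=True))
        P /= P.sum(1, keepdims=True)
        dL = (P - Y) / n             # softmax cross-entropy gradient
        dR = Hn.T @ dL
        dZ = (dL @ R.T) * (Z > 0)
        dW = AH.T @ dZ
        R -= lr * dR
        W -= lr * dW
    return np.maximum(AH @ W, 0)     # module output = next module's input

# Decoupled training: each "layer" is a simple module optimized in sequence,
# so no gradient ever flows through the full multi-layer stack.
H = X
for _ in range(3):
    H = train_module(H, Y)
```

Because each module sees only one hop of propagation while it trains, the exponential node-dependency blow-up of end-to-end multi-layer training never arises; the cost is that the modules are optimized greedily rather than jointly.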