March 19, 2024, 4:42 a.m. | MoonJeong Park, Jaeseung Heo, Dongwoo Kim

cs.LG updates on arXiv.org

arXiv:2403.10543v1 Announce Type: cross
Abstract: Graph Neural Network (GNN) resembles the diffusion process, leading to the over-smoothing of learned representations when stacking many layers. Hence, the reverse process of message passing can sharpen the node representations by inverting the forward message propagation. The sharpened representations can help us to better distinguish neighboring nodes with different labels, such as in heterophilic graphs. In this work, we apply the design principle of the reverse process to the three variants of the GNNs. …

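To make the idea concrete, here is a minimal sketch (not the authors' implementation, and only an assumption about the details behind the abstract) of what "inverting the forward message propagation" could look like for GCN-style propagation: forward steps multiply the features by a symmetric-normalized adjacency matrix, which smooths them, and a reverse step solves the corresponding linear system, which sharpens them back.

```python
# A minimal sketch of reversing diffusion-like message passing.
# Assumes GCN-style propagation H' = A_hat @ H with
# A_hat = D^{-1/2} (A + I) D^{-1/2}; the paper's actual reverse-process
# designs for the three GNN variants may differ.
import numpy as np

def normalized_adjacency(A):
    """Symmetric-normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

def forward_smooth(H, A_hat, steps=1):
    """Forward message passing: each step mixes neighbor features (smoothing)."""
    for _ in range(steps):
        H = A_hat @ H
    return H

def reverse_sharpen(H, A_hat, steps=1):
    """Reverse process: solve A_hat @ X = H at each step, i.e. invert one
    propagation step, which sharpens the representations."""
    for _ in range(steps):
        H = np.linalg.solve(A_hat, H)
    return H

# Toy example: a 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

A_hat = normalized_adjacency(A)
H_smooth = forward_smooth(H, A_hat, steps=3)          # progressively over-smoothed
H_sharp = reverse_sharpen(H_smooth, A_hat, steps=3)   # approximately recovers H
print(np.allclose(H_sharp, H))                        # True up to numerical error
```

In this toy setting the reverse step exactly undoes the smoothing; in practice exact inversion can amplify noise when the propagation matrix is near-singular, which is presumably why the paper adapts the reverse-process principle to each GNN variant rather than relying on a plain matrix inverse.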