Representing Long-Range Context for Graph Neural Networks with Global Attention. (arXiv:2201.08821v1 [cs.LG])
cs.LG updates on arXiv.org
Graph neural networks are powerful architectures for structured datasets.
However, current methods struggle to represent long-range dependencies. Scaling
the depth or width of GNNs is insufficient to broaden receptive fields as
larger GNNs encounter optimization instabilities such as vanishing gradients
and representation oversmoothing, while pooling-based approaches have yet to
become as universally useful as in computer vision. In this work, we propose
the use of Transformer-based self-attention to learn long-range pairwise
relationships, with a novel "readout" mechanism to obtain a …
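As a rough illustration of the design described above, the sketch below combines local GNN message passing with dense Transformer self-attention over all node embeddings, reading a global graph embedding off a learned special token. It is a minimal sketch in plain PyTorch: the class name, the mean-aggregation GNN layers, and the CLS-style readout token are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class GNNThenTransformer(nn.Module):
    """Illustrative sketch (not the paper's exact model): a small
    message-passing GNN followed by Transformer self-attention,
    with a learned readout token for a global graph embedding."""

    def __init__(self, in_dim, hidden_dim=64, gnn_layers=2, num_heads=4):
        super().__init__()
        # Simple mean-aggregation GNN layers (stand-in for any local GNN).
        self.gnn_layers = nn.ModuleList(
            nn.Linear(in_dim if i == 0 else hidden_dim, hidden_dim)
            for i in range(gnn_layers)
        )
        # Dense pairwise self-attention gives every node a full-graph
        # receptive field in a single hop, capturing long-range pairs.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Learned readout token, prepended like BERT's [CLS] (assumed
        # here as one plausible realization of a "readout" mechanism).
        self.cls_token = nn.Parameter(torch.zeros(1, 1, hidden_dim))

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features; adj: (N, N) adjacency.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        for layer in self.gnn_layers:
            x = torch.relu(layer((adj @ x) / deg))  # mean over neighbors
        # Treat the node set as a sequence (single-graph batch of 1).
        tokens = torch.cat([self.cls_token, x.unsqueeze(0)], dim=1)
        out = self.transformer(tokens)
        return out[:, 0]  # embedding at the readout token

# Usage on a toy 5-node graph with random features and edges.
x = torch.randn(5, 16)
adj = (torch.rand(5, 5) > 0.5).float()
model = GNNThenTransformer(in_dim=16)
print(model(x, adj).shape)  # torch.Size([1, 64])
```

The two-stage split reflects the abstract's motivation: the GNN handles local structure without needing the depth that triggers oversmoothing, while the Transformer supplies long-range pairwise interactions and the readout token replaces ad hoc pooling.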