Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs. (arXiv:2304.11140v1 [stat.ML])
stat.ML updates on arXiv.org
We study the convergence of message passing graph neural networks on random
graph models to their continuous counterpart as the number of nodes tends to
infinity. Until now, this convergence was only known for architectures with
aggregation functions in the form of degree-normalized means. We extend such
results to a very large class of aggregation functions that encompasses all
classically used message passing graph neural networks, such as attention-based
message passing or max convolutional message passing on top of
(degree-normalized) …
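As a rough illustration of the aggregation families the abstract names (not the paper's code), the sketch below implements one round of message passing over an adjacency matrix with three interchangeable aggregations: a degree-normalized mean, a coordinate-wise max, and a softmax-attention weighting. The dot-product form of the attention scores is an assumption for the sketch, not taken from the paper.

```python
import numpy as np

def message_passing(A, X, agg="mean"):
    """One round of message passing.

    A: (n, n) 0/1 adjacency matrix, X: (n, d) node features.
    agg selects the aggregation family (sketch only, assumed forms).
    """
    n, d = X.shape
    out = np.zeros_like(X, dtype=float)
    for i in range(n):
        nbrs = np.nonzero(A[i])[0]
        if len(nbrs) == 0:
            continue                        # isolated node: zero output
        msgs = X[nbrs].astype(float)        # messages from neighbors
        if agg == "mean":                   # degree-normalized mean
            out[i] = msgs.mean(axis=0)
        elif agg == "max":                  # max "convolutional" aggregation
            out[i] = msgs.max(axis=0)
        elif agg == "attention":            # softmax attention weights
            scores = msgs @ X[i]            # dot-product scores (assumed form)
            w = np.exp(scores - scores.max())
            w /= w.sum()
            out[i] = w @ msgs               # attention-weighted sum
        else:
            raise ValueError(f"unknown aggregation: {agg}")
    return out
```

Swapping `agg` changes only the neighborhood-reduction step, which is exactly the degree of generality the paper's convergence result is about: the same architecture with a different aggregation function.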