Web: http://arxiv.org/abs/2206.10991

June 23, 2022, 1:12 a.m. | Francesco Di Giovanni, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, Michael M. Bronstein

stat.ML updates on arXiv.org

Dynamical systems that minimize an energy are ubiquitous in geometry and
physics. We propose a gradient-flow framework for GNNs in which the equations
follow the direction of steepest descent of a learnable energy. This approach
makes it possible to interpret the GNN evolution from a multi-particle
perspective as learning attractive and repulsive forces in feature space via
the positive and negative eigenvalues of a symmetric "channel-mixing" matrix.
We perform spectral analysis of the solutions and conclude that gradient-flow
graph convolutional models can …
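To make the idea concrete, here is a minimal numpy sketch of the kind of dynamics the abstract describes: node features evolve by explicit-Euler steps along the steepest descent of a quadratic energy built from a symmetrically normalised adjacency and a learnable symmetric channel-mixing matrix. The energy form `E(X) = -1/2 tr(X^T Â X W)` and all names (`sym_norm_adj`, `gradient_flow_step`, the step size `tau`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sym_norm_adj(A):
    """Symmetrically normalised adjacency A_hat = D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def energy(X, A_hat, W):
    # Assumed quadratic energy: E(X) = -1/2 tr(X^T A_hat X W).
    return -0.5 * np.trace(X.T @ A_hat @ X @ W)

def gradient_flow_step(X, A_hat, W, tau=1e-2):
    """One explicit-Euler step of dX/dt = -dE/dX = A_hat X W.

    With W symmetric, its positive eigenvalues act as attractive forces
    and its negative eigenvalues as repulsive forces in feature space.
    """
    return X + tau * (A_hat @ X @ W)

rng = np.random.default_rng(0)
n, d = 6, 4

# Random undirected graph and symmetric channel-mixing matrix.
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T
A_hat = sym_norm_adj(A)
W = rng.standard_normal((d, d))
W = 0.5 * (W + W.T)

X = rng.standard_normal((n, d))
e0 = energy(X, A_hat, W)
X = gradient_flow_step(X, A_hat, W)
e1 = energy(X, A_hat, W)
# A sufficiently small step moves along steepest descent,
# so the energy decreases: e1 < e0.
```

Stacking many such steps with shared `W` gives a residual graph-convolution network whose layers, by construction, monotonically decrease the learned energy.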
