June 23, 2022, 1:12 a.m. | Francesco Di Giovanni, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, Michael M. Bronstein

stat.ML updates on arXiv.org

Dynamical systems minimizing an energy are ubiquitous in geometry and
physics. We propose a gradient flow framework for GNNs in which the equations
follow the direction of steepest descent of a learnable energy. This approach
allows the GNN evolution to be explained from a multi-particle perspective, as
learning attractive and repulsive forces in feature space via the positive and
negative eigenvalues of a symmetric "channel-mixing" matrix. We perform
spectral analysis of the solutions and conclude that gradient flow graph
convolutional models can …
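The core idea can be illustrated with a minimal sketch. Assuming an energy of the form E(X) = -½ tr(XᵀAXW_s) with symmetric adjacency A and a symmetrized channel-mixing matrix W_s, the steepest-descent flow is dX/dt = A X W_s, which a forward Euler step discretizes into a residual GNN layer. The function name `graff_step` and the choice of energy are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def graff_step(X, A, W, tau=0.1):
    """One Euler step of a gradient-flow GNN layer (illustrative sketch).

    X : (n, d) node feature matrix
    A : (n, n) symmetric adjacency (or normalized adjacency) matrix
    W : (d, d) learnable channel-mixing matrix
    tau : step size
    """
    # Symmetrize the channel-mixing matrix so the update is the
    # steepest-descent flow of E(X) = -1/2 * tr(X^T A X W_s).
    W_s = 0.5 * (W + W.T)
    # Positive eigenvalues of W_s act as attractive (smoothing) forces
    # along edges; negative eigenvalues act as repulsive (sharpening) ones.
    return X + tau * A @ X @ W_s
```

Because the update follows the negative gradient of the energy, E decreases along the discrete flow for a sufficiently small step size, which is what makes the multi-particle interpretation possible.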

