Web: http://arxiv.org/abs/2201.10129

Jan. 26, 2022, 2:11 a.m. | Chen Cai, Yusu Wang

cs.LG updates on arXiv.org

Although theoretical properties of graph neural networks (GNNs), such as expressive power and over-smoothing, have been extensively studied in recent years, their convergence properties remain a relatively new direction. In this paper, we investigate the convergence of one powerful GNN, the Invariant Graph Network (IGN), over graphs sampled from graphons.
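For context on the setting, here is a minimal sketch of the standard W-random graph model the abstract refers to: latent positions are drawn uniformly on [0,1] and edges appear independently with probability given by the graphon. The function name and the example graphon are illustrative, not from the paper.

```python
import numpy as np

def sample_graph_from_graphon(W, n, seed=None):
    """Sample an n-node simple graph from a graphon W: [0,1]^2 -> [0,1].

    Latent positions u_i ~ Uniform[0,1]; edge (i, j) is present
    independently with probability W(u_i, u_j).
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    P = W(u[:, None], u[None, :])            # edge-probability matrix
    coins = rng.uniform(size=(n, n)) < P     # Bernoulli draws
    A = np.triu(coins, k=1)                  # keep strict upper triangle
    A = (A | A.T).astype(int)                # symmetrize: undirected graph
    np.fill_diagonal(A, 0)                   # no self-loops
    return A

# Example: product graphon W(x, y) = x * y
A = sample_graph_from_graphon(lambda x, y: x * y, n=100, seed=0)
```

As the number of sampled nodes grows, such graphs converge to the graphon in cut distance, which is the regime in which convergence of a GNN can be asked about.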

We first prove the stability of linear layers for general $k$-IGN (of order
$k$) based on a novel interpretation of linear equivariant layers. Building
upon this result, we prove the convergence of $k$-IGN …
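The linear layers mentioned above are linear permutation-equivariant maps; for order-2 tensors the full basis has 15 operations (Maron et al., 2019), and a 2-IGN linear layer is a learned combination of them. The sketch below implements only an illustrative subset of that basis, with made-up function and weight names; it is not the paper's formulation.

```python
import numpy as np

def equivariant_layer_2(X, w):
    """Apply a linear combination of 5 of the 15 basis operations for
    linear permutation-equivariant maps R^{n x n} -> R^{n x n}.

    X: (n, n) input tensor; w: length-5 coefficient vector.
    Each op commutes with simultaneous row/column permutation.
    """
    n = X.shape[0]
    ops = [
        X,                                                    # identity
        X.T,                                                  # transpose
        X.sum(axis=1, keepdims=True) @ np.ones((1, n)) / n,   # broadcast row sums
        np.ones((n, 1)) @ X.sum(axis=0, keepdims=True) / n,   # broadcast column sums
        np.full((n, n), X.sum() / n**2),                      # broadcast total sum
    ]
    return sum(wi * op for wi, op in zip(w, ops))
```

Equivariance means that for any permutation matrix P, the layer satisfies f(P X Pᵀ) = P f(X) Pᵀ, which can be checked numerically on random inputs.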
