Not too little, not too much: a theoretical analysis of graph (over)smoothing. (arXiv:2205.12156v1 [stat.ML])
May 25, 2022, 1:11 a.m. | Nicolas Keriven
stat.ML updates on arXiv.org arxiv.org
We analyze graph smoothing with \emph{mean aggregation}, where each node
successively receives the average of the features of its neighbors. Indeed, it
has quickly been observed that Graph Neural Networks (GNNs), which generally
follow some variant of Message-Passing (MP) with repeated aggregation, may be
subject to the \emph{oversmoothing} phenomenon: by performing too many rounds
of MP, the node features tend to converge to a non-informative limit. In the
case of mean aggregation, for connected graphs, the node features become
constant …
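The oversmoothing effect described above is easy to observe numerically. Below is a minimal sketch (not from the paper; the graph and feature values are made up for illustration) of mean aggregation: each round replaces every node's feature with the average of its neighbors' features, i.e. one application of the row-normalized operator $W = D^{-1}A$. On a small connected, non-bipartite graph, repeated application drives the features to a constant vector.

```python
import numpy as np

# Adjacency matrix of a small connected graph: a path 0-1-2-3 plus the
# edge 0-2, so the graph contains a triangle (non-bipartite, hence the
# random-walk dynamics are aperiodic and converge).
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Row-normalized operator W = D^{-1} A: (W X)_i is the mean of the
# features of node i's neighbors.
W = A / A.sum(axis=1, keepdims=True)

# Arbitrary initial scalar node features (one feature per node).
X = np.array([[1.0], [0.0], [-1.0], [2.0]])

# Many rounds of mean-aggregation message passing.
for _ in range(200):
    X = W @ X

print(X.ravel())   # features are now (nearly) identical across nodes
print(np.ptp(X))   # spread across nodes, close to 0
```

The limit is the degree-weighted average of the initial features (the stationary distribution of the random walk is proportional to node degree), which is exactly the non-informative constant limit the abstract refers to: all node-level distinctions are smoothed away.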