Nov. 3, 2022, 1:12 a.m. | Guanzi Chen, Jiying Zhang, Xi Xiao, Yang Li

cs.LG updates on arXiv.org

In recent years, hypergraph learning has attracted great attention due to its
capacity to represent complex and high-order relationships. However, current
neural network approaches designed for hypergraphs are mostly shallow, which
limits their ability to extract information from high-order neighbors. In
this paper, we show, both theoretically and empirically, that the performance
of hypergraph neural networks does not improve as the number of layers
increases: with depth, the representations of distinct nodes converge and
become indistinguishable, which is known as the over-smoothing problem. To
avoid this issue, we develop a new …
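The truncated abstract leaves the paper's remedy unstated. Purely as a hedged illustration of the over-smoothing effect it describes (not the paper's method), the sketch below repeatedly applies the standard hypergraph convolution operator of HGNN (Feng et al., AAAI 2019), D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2}, to random features on a small made-up hypergraph; the learnable weights and nonlinearity are omitted for clarity, and the incidence matrix and layer counts are illustrative assumptions.

# Minimal sketch of over-smoothing under stacked hypergraph convolutions.
# Not the paper's proposed architecture; it only demonstrates the phenomenon.
import numpy as np

rng = np.random.default_rng(0)

# Toy incidence matrix H: 6 nodes (rows) x 3 hyperedges (columns).
# Hyperedges: {1,2,6}, {2,3,4}, {4,5,6} -- a connected toy hypergraph.
H = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
    [1, 0, 1],
], dtype=float)

W = np.eye(H.shape[1])                        # unit hyperedge weights
Dv = np.diag((H @ W).sum(axis=1))             # node degree matrix
De = np.diag(H.sum(axis=0))                   # hyperedge degree matrix
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(Dv)))
De_inv = np.diag(1.0 / np.diag(De))

# Normalized propagation operator used by HGNN-style convolutions.
P = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt

X = rng.standard_normal((H.shape[0], 4))      # random initial node features
for layer in range(1, 33):
    X = P @ X                                 # one linear propagation step
    s = np.linalg.svd(X, compute_uv=False)    # singular values, descending
    if layer in (1, 2, 4, 8, 16, 32):
        # As depth grows, X collapses toward rank 1: every node's feature
        # becomes a scalar multiple of the same vector, so s2/s1 -> 0.
        print(f"layer {layer:2d}: s2/s1 = {s[1] / s[0]:.2e}")

Running this, the ratio of the second to the leading singular value of the feature matrix shrinks toward zero with depth, i.e., node representations become indistinguishable up to scale, which is exactly the over-smoothing behavior the abstract refers to.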

arxiv hypergraph networks neural networks
