March 20, 2024, 4:43 a.m. | Bohan Tang, Siheng Chen, Xiaowen Dong

cs.LG updates on arXiv.org arxiv.org

arXiv:2312.09778v2 Announce Type: replace
Abstract: Hypergraphs are vital for modelling data with higher-order relations involving more than two entities, and are gaining prominence in machine learning and signal processing. Many hypergraph neural networks leverage message passing over hypergraph structures to enhance node representation learning, yielding impressive performance in tasks like hypergraph node classification. However, these message-passing-based models face several challenges, including oversmoothing as well as high latency and sensitivity to structural perturbations at inference time. To tackle these challenges, we propose an …
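The message passing over hypergraph structures that the abstract refers to is typically a two-stage scheme: node features are first aggregated into their hyperedges, then hyperedge features are aggregated back into nodes. A minimal sketch of this pattern using an incidence matrix, with illustrative mean aggregation (not the paper's actual model):

```python
import numpy as np

def hypergraph_message_pass(X, H):
    """One round of node -> hyperedge -> node message passing.

    X: (n, d) node feature matrix
    H: (n, m) incidence matrix; H[v, e] = 1 if node v belongs to hyperedge e
    """
    # Degree normalizers; clamp to 1 to avoid dividing by zero for
    # isolated nodes or empty hyperedges.
    edge_deg = np.maximum(H.sum(axis=0), 1.0)   # (m,) nodes per hyperedge
    node_deg = np.maximum(H.sum(axis=1), 1.0)   # (n,) hyperedges per node

    # Stage 1: each hyperedge averages the features of its member nodes.
    E = (H.T @ X) / edge_deg[:, None]           # (m, d)

    # Stage 2: each node averages the features of its incident hyperedges.
    return (H @ E) / node_deg[:, None]          # (n, d)

# Toy example: 4 nodes, 2 hyperedges ({0, 1, 2} and {2, 3}).
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
X = np.eye(4)                    # one-hot node features
X_new = hypergraph_message_pass(X, H)
```

Repeating this update across layers is what drives the oversmoothing the abstract mentions: after many rounds of averaging within shared hyperedges, node features converge toward each other and become indistinguishable.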

