Self-Attention Empowered Graph Convolutional Network for Structure Learning and Node Embedding
March 7, 2024, 5:41 a.m. | Mengying Jiang, Guizhong Liu, Yuanchao Su, Xinliang Wu
cs.LG updates on arXiv.org
Abstract: In representation learning on graph-structured data, many popular graph neural networks (GNNs) fail to capture long-range dependencies, leading to performance degradation. This weakness is further magnified when the graph in question is characterized by heterophily (low homophily). To address this issue, this paper proposes a novel graph learning framework called the graph convolutional network with self-attention (GCN-SA). The proposed scheme exhibits an exceptional generalization capability in node-level representation learning. The proposed GCN-SA contains two enhancements corresponding …
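The excerpt does not include the paper's concrete formulation, but the core idea it names — pairing standard GCN propagation (local neighborhoods) with self-attention (all node pairs, hence long-range and heterophily-tolerant dependencies) — can be sketched as a single layer. Everything below is an illustrative assumption, not the authors' method: the function name `gcn_sa_layer`, the weight matrices, and the simple additive combination of the two branches are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gcn_sa_layer(X, A, Wq, Wk, Wv, Wg):
    """Hypothetical sketch of one GCN-with-self-attention layer.

    X: (n, d) node features; A: (n, n) binary adjacency;
    Wq, Wk, Wv, Wg: (d, h) projection weights (all assumed names).
    """
    # Self-attention branch: scores every node pair, so information
    # can flow between distant nodes regardless of graph distance.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    H_att = attn @ V

    # GCN branch: symmetrically normalized propagation with self-loops
    # (the standard Kipf & Welling rule), capturing local structure.
    A_hat = A + np.eye(A.shape[0])
    deg = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(deg, deg))
    H_gcn = A_norm @ X @ Wg

    # Combine global (attention) and local (convolution) views;
    # a plain sum + ReLU here, purely for illustration.
    return np.maximum(H_att + H_gcn, 0.0)
```

On a homophilous graph the GCN branch dominates usefully; under heterophily the attention branch lets a node weight feature-similar but graph-distant nodes, which is the failure mode the abstract highlights.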