AGFormer: Efficient Graph Representation with Anchor-Graph Transformer. (arXiv:2305.07521v1 [cs.LG])
cs.LG updates on arXiv.org arxiv.org
To alleviate the limited receptive field of GCNs, Transformers have been
exploited to capture long-range dependencies between nodes for graph data
representation and learning. However, existing graph Transformers generally
employ a regular self-attention module for all node-to-node message passing,
which must learn the affinities/relationships between all node pairs, leading
to high computational cost. They are also usually sensitive to graph noise. To
overcome these issues, we propose a novel graph Transformer architecture,
termed Anchor Graph Transformer (AGFormer), …
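The snippet is truncated before the method details, but the general idea behind anchor-based attention can be illustrated. A minimal sketch (not the paper's actual AGFormer implementation): instead of full N×N node-to-node attention, messages are routed through a small set of m anchor nodes in two stages (nodes → anchors, anchors → nodes), cutting the cost from O(N²) to O(N·m). All function and variable names below are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def anchor_attention(X, anchor_idx):
    """Two-stage anchor-mediated attention (illustrative sketch).

    X          : (N, d) node feature matrix
    anchor_idx : indices of the m anchor nodes
    Full self-attention would form an (N, N) affinity matrix;
    here the largest matrices are (m, N) and (N, m), i.e. O(N * m).
    """
    d = X.shape[1]
    A = X[anchor_idx]                                  # (m, d) anchor features
    # Stage 1: each anchor aggregates information from all nodes.
    s1 = softmax(A @ X.T / np.sqrt(d))                 # (m, N) attention weights
    A_new = s1 @ X                                     # (m, d) updated anchors
    # Stage 2: each node reads back from the updated anchors.
    s2 = softmax(X @ A_new.T / np.sqrt(d))             # (N, m) attention weights
    return s2 @ A_new                                  # (N, d) updated nodes

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))                         # 100 nodes, 16-dim features
out = anchor_attention(X, anchor_idx=np.array([0, 10, 20, 30]))
print(out.shape)  # (100, 16)
```

Because every node only attends to the m anchors (and vice versa), the anchor set acts as a low-rank bottleneck for message passing, which is also what can make such schemes less sensitive to noisy individual edges or nodes.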