May 15, 2023, 12:44 a.m. | Bo Jiang, Fei Xu, Ziyan Zhang, Jin Tang, Feiping Nie

cs.LG updates on arXiv.org

To alleviate the local receptive field issue of GCNs, Transformers have been
exploited to capture long-range dependencies between nodes for graph data
representation and learning. However, existing graph Transformers generally
employ a regular self-attention module for all node-to-node message passing,
which requires learning the affinities/relationships between all pairs of
nodes and thus incurs a high computational cost. They are also usually
sensitive to graph noise. To overcome these issues, we propose a novel graph
Transformer architecture, termed Anchor Graph Transformer (AGFormer), …
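As a rough illustration of the cost argument above, the sketch below contrasts full node-to-node self-attention, which scales quadratically in the number of nodes n, with a generic anchor-based variant in which every node attends only to a small set of m anchor nodes, bringing the cost down to roughly O(n·m). This is a hedged, minimal example of the general anchor-attention idea, not the authors' exact AGFormer design; the anchor-selection scheme, function names, and hyperparameters here are illustrative assumptions.

```python
# Minimal sketch (assumed design, not the paper's AGFormer): full self-attention
# vs. anchor-based attention that routes messages through m << n anchor nodes.
import torch
import torch.nn.functional as F


def full_self_attention(x):
    # x: (n, d) node features; attention over all n*n node pairs -> O(n^2 * d).
    d = x.shape[1]
    attn = F.softmax(x @ x.t() / d ** 0.5, dim=-1)  # (n, n) affinity map
    return attn @ x


def anchor_attention(x, num_anchors=32):
    # x: (n, d) node features. Pick m anchors, then use two small attention
    # maps of shape (n, m) and (m, n) instead of one (n, n) map -> O(n * m * d).
    n, d = x.shape
    idx = torch.randperm(n)[:num_anchors]               # illustrative anchor choice
    anchors = x[idx]                                      # (m, d)
    node_to_anchor = F.softmax(x @ anchors.t() / d ** 0.5, dim=-1)   # (n, m)
    anchor_to_node = F.softmax(anchors @ x.t() / d ** 0.5, dim=-1)   # (m, n)
    # Anchors aggregate from all nodes, then broadcast back to every node.
    return node_to_anchor @ (anchor_to_node @ x)          # (n, d)


if __name__ == "__main__":
    x = torch.randn(1000, 64)
    print(full_self_attention(x).shape)  # torch.Size([1000, 64])
    print(anchor_attention(x).shape)     # torch.Size([1000, 64])
```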
