GRPE: Relative Positional Encoding for Graph Transformer. (arXiv:2201.12787v3 [cs.LG] UPDATED)
Oct. 17, 2022, 1:12 a.m. | Wonpyo Park, Woonggi Chang, Donggeon Lee, Juntae Kim, Seung-won Hwang
cs.LG updates on arXiv.org
We propose a novel positional encoding for learning graphs with the Transformer
architecture. Existing approaches either linearize a graph to encode absolute
positions in the resulting sequence of nodes, or encode relative positions with
respect to other nodes using bias terms. The former loses the precision of
relative position through linearization, while the latter loses tight
integration of node-edge and node-topology interaction. To overcome the
weaknesses of both approaches, our method encodes a graph without linearization
and considers both node-topology and node-edge interaction. …
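The abstract does not give the exact formulation, but the general idea of injecting relative graph structure into attention scores, rather than linearizing the graph, can be sketched as follows. This is a minimal toy sketch, not the paper's method: the function name, the embedding tables, and the use of bucketed shortest-path distances for topology are all illustrative assumptions.

```python
import numpy as np

def graph_attention_with_relative_encoding(Q, K, topo_ids, edge_ids,
                                           E_topo, E_edge):
    """Toy sketch of relative positional encoding in graph attention.

    Q, K       : (n, d) query/key matrices for n nodes.
    topo_ids   : (n, n) int matrix of topology relations between node pairs,
                 e.g. bucketed shortest-path distances (an assumption here).
    edge_ids   : (n, n) int matrix of edge-type ids (0 = no edge).
    E_topo     : (num_topo_ids, d) topology embedding table.
    E_edge     : (num_edge_ids, d) edge-type embedding table.

    The score for a node pair combines a node-node term with node-topology
    and node-edge interaction terms, so relative structure enters attention
    directly instead of via a linearized node order.
    """
    d = Q.shape[1]
    # node-node term: standard dot-product attention
    scores = Q @ K.T
    # node-topology term: each query interacts with the pair's topology embedding
    scores += np.einsum('id,ijd->ij', Q, E_topo[topo_ids])
    # node-edge term: each query interacts with the pair's edge-type embedding
    scores += np.einsum('id,ijd->ij', Q, E_edge[edge_ids])
    scores /= np.sqrt(d)
    # row-wise softmax to obtain attention weights
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    return w / w.sum(axis=1, keepdims=True)
```

In a full model the embedding tables would be learned and the interaction terms would typically appear per attention head; the sketch keeps a single head to show only the structure of the score computation.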