April 24, 2023, 12:45 a.m. | Romain Menegaux, Emmanuel Jehanno, Margot Selosse, Julien Mairal

cs.LG updates on arXiv.org

We introduce a novel self-attention mechanism, Chromatic Self-Attention (CSA), which extends the notion of attention scores to attention _filters_ that independently modulate the feature channels. We showcase CSA in a fully attentional graph Transformer, the Chromatic Graph Transformer (CGT), which integrates both graph structural information and edge features, completely bypassing the need for local message-passing components. Our method flexibly encodes graph structure through node-node interactions by enriching the original edge features with a relative positional encoding scheme. We propose a …
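The abstract gives enough to sketch the core idea: instead of one scalar attention score per node pair, each pair gets a per-channel attention filter, with enriched edge features entering the filter channel-wise before normalization. Below is a minimal single-head sketch in PyTorch. The class name, the `edge_feats` tensor layout, and the exact way edge features enter the filter are illustrative assumptions, not the paper's precise parameterization.

```python
# Minimal sketch of a chromatic self-attention layer (hypothetical
# parameterization inferred from the abstract, not the paper's exact one).
import torch
import torch.nn as nn

class ChromaticSelfAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, edge_feats: torch.Tensor) -> torch.Tensor:
        # x: (n, d) node features
        # edge_feats: (n, n, d) edge features, assumed already enriched
        # with a relative positional encoding of the graph structure.
        q, k, v = self.q(x), self.k(x), self.v(x)
        # Elementwise product gives one score per channel instead of a
        # single scalar per node pair: scores has shape (n, n, d).
        scores = q.unsqueeze(1) * k.unsqueeze(0) * self.scale
        # Edge features shift the filter, channel by channel, so graph
        # structure is encoded purely through node-node interactions.
        scores = scores + edge_feats
        # Normalizing each channel separately over the key axis yields an
        # attention *filter* rather than a scalar attention *score*.
        filt = torch.softmax(scores, dim=1)
        # Each output channel aggregates values with its own weights.
        return (filt * v.unsqueeze(0)).sum(dim=1)
```

As a quick check, `ChromaticSelfAttention(16)(torch.randn(5, 16), torch.randn(5, 5, 16))` returns updated `(5, 16)` node features; collapsing the filter to a single channel would reduce this to standard scaled dot-product attention with an additive edge bias.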

