Sept. 28, 2022, 1:12 a.m. | Beibei Wang, Bo Jiang

cs.LG updates on arXiv.org

Graph Attention Networks (GATs) have been intensively studied and are widely used
in graph data learning tasks. Existing GATs generally adopt the self-attention
mechanism for graph edge attention learning, which requires expensive
computation. It is known that Spiking Neural Networks (SNNs) can perform
inexpensive computation by transforming the input signal into discrete
spike trains, and they can also return sparse outputs. Inspired by these merits of
SNNs, in this work, we propose a novel Graph Spiking Attention Network (GSAT)
for graph …
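The snippet below is a minimal sketch, in Python/NumPy, of the two ingredients the abstract contrasts: GAT-style self-attention over graph edges, whose per-edge scoring and neighborhood softmax make it comparatively expensive, and rate coding of real-valued features into discrete spike trains, the cheap, sparse signalling that SNNs rely on. It is not the paper's GSAT model; function names such as gat_edge_attention and rate_encode are illustrative assumptions.

```python
import numpy as np

def gat_edge_attention(h, edges, W, a):
    """Per-edge attention coefficients in the style of a GAT layer.

    h:     (N, F) node features
    edges: list of directed (i, j) pairs, meaning j is a neighbor of i
    W:     (F, F_out) shared linear transform
    a:     (2 * F_out,) attention vector
    Returns a dict mapping (i, j) -> normalized attention weight.
    """
    z = h @ W                                     # transform every node once
    logits = {}
    for i, j in edges:
        e = np.concatenate([z[i], z[j]]) @ a      # un-normalized edge score
        logits[(i, j)] = e if e > 0 else 0.2 * e  # LeakyReLU, slope 0.2
    alpha = {}
    for i in {src for src, _ in edges}:           # softmax over each neighborhood
        nbrs = [j for src, j in edges if src == i]
        scores = np.array([logits[(i, j)] for j in nbrs])
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        alpha.update({(i, j): p for j, p in zip(nbrs, probs)})
    return alpha

def rate_encode(x, n_steps=16, rng=None):
    """Encode features in [0, 1] as Bernoulli spike trains over n_steps time steps."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random((n_steps,) + x.shape) < x).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h = rng.random((4, 3))                        # 4 nodes, 3 features in [0, 1]
    edges = [(0, 1), (0, 2), (1, 0), (2, 3), (3, 2)]
    W = rng.standard_normal((3, 3))
    a = rng.standard_normal(6)
    print(gat_edge_attention(h, edges, W, a))     # dense, per-edge softmax weights
    print(rate_encode(h, n_steps=4, rng=rng).sum())  # sparse binary spike events
```

The contrast the sketch illustrates is that the attention path evaluates a dense score for every edge and normalizes it, while the spike encoding replaces real-valued signals with sparse binary events that are cheap to transmit and accumulate.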

Tags: arxiv, graph network, neural network, spiking neural network
