Oct. 3, 2022, 1:12 a.m. | Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan, Yonghong Tian, Li Yuan

cs.LG updates on arXiv.org

We consider two biologically plausible structures, the Spiking Neural Network
(SNN) and the self-attention mechanism. The former offers an energy-efficient,
event-driven paradigm for deep learning, while the latter can capture feature
dependencies, enabling the Transformer to achieve strong performance. It is
therefore intuitively promising to explore a marriage between the two. In this
paper, we leverage both the self-attention capability and the biological
properties of SNNs, and propose a novel Spiking Self Attention (SSA) mechanism
as well as a powerful framework, …
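The abstract stops short of the SSA formula, so the following is a minimal NumPy sketch of what a spiking self-attention step could look like, assuming binary spike tensors and a softmax-free Q·Kᵀ·V product. The `heaviside` thresholding, the scaling factor, and all shapes are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of a spiking self-attention step; NOT the paper's
# exact SSA (the abstract above does not give its formula). Assumes
# binary spike tensors obtained by Heaviside thresholding.
import numpy as np

rng = np.random.default_rng(0)

def heaviside(x, threshold=0.5):
    """Binarize real-valued activations into 0/1 spikes."""
    return (x > threshold).astype(np.float32)

def spiking_self_attention(x, wq, wk, wv, scale=0.125):
    """One illustrative SSA-style step over spike inputs.

    x: (T, N, D) spikes for T time steps, N tokens, D channels.
    Q, K, V are re-binarized after their linear projections, and the
    score-value product Q @ K^T @ V is taken without softmax, since all
    operands are non-negative spikes (an assumption of this sketch).
    """
    q = heaviside(x @ wq)                 # (T, N, D) spike-form queries
    k = heaviside(x @ wk)                 # (T, N, D) spike-form keys
    v = heaviside(x @ wv)                 # (T, N, D) spike-form values
    scores = q @ k.transpose(0, 2, 1)     # (T, N, N) integer spike overlaps
    return heaviside(scores @ v * scale)  # (T, N, D) spike output

T, N, D = 4, 8, 16                        # time steps, tokens, channels
x = heaviside(rng.random((T, N, D)))
wq, wk, wv = (rng.normal(size=(D, D)) for _ in range(3))
print(spiking_self_attention(x, wq, wk, wv).shape)  # (4, 8, 16)
```

Because every operand in this sketch is a non-negative 0/1 spike, the attention scores are plain integer overlap counts, which is what would make a softmax (and its exponentials) unnecessary.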

Tags: arxiv, network, neural network, spiking neural network, transformer
