Oct. 3, 2022, 1:12 a.m. | Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan, Yonghong Tian, Li Yuan

cs.LG updates on arXiv.org

We consider two biologically plausible structures, the Spiking Neural Network
(SNN) and the self-attention mechanism. The former offers an energy-efficient,
event-driven paradigm for deep learning, while the latter captures feature
dependencies, enabling the Transformer to achieve strong performance.
Combining the two is therefore an intuitively promising direction. In this
paper, we leverage both the self-attention capability and the biological
properties of SNNs, and propose a novel Spiking Self-Attention (SSA) mechanism
as well as a powerful framework, …
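The excerpt above describes the idea only at a high level, so the following is a minimal sketch of what a spike-based self-attention layer might look like, assuming binary spike-form Q, K, and V and no softmax (spike maps are already non-negative). The `SpikeActivation` module, the `scale` parameter, and all layer names here are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class SpikeActivation(nn.Module):
    """Hypothetical binary spike activation (Heaviside step).

    Stands in for a LIF neuron layer; the paper's actual neuron model
    and surrogate-gradient training trick are not shown in the excerpt.
    """
    def forward(self, x):
        return (x > 0).float()

class SpikingSelfAttention(nn.Module):
    """A minimal sketch of spike-form self-attention (assumed design).

    Q, K, V are converted to binary spike tensors before the attention
    product, and no softmax is applied.
    """
    def __init__(self, dim, scale=0.125):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)
        self.spike = SpikeActivation()
        self.scale = scale  # assumed factor to keep the integer-valued product in range

    def forward(self, x):
        # x: (batch, tokens, dim); Q/K/V become binary spike tensors
        q = self.spike(self.q_proj(x))
        k = self.spike(self.k_proj(x))
        v = self.spike(self.v_proj(x))
        # Attention without softmax: with binary Q, K, V the matrix
        # products involve only 0/1 multiplications, i.e. additions,
        # which is the source of the claimed energy efficiency.
        attn = q @ k.transpose(-2, -1) * self.scale
        out = self.spike(attn @ v)
        return self.out_proj(out)

x = torch.randn(2, 16, 64)
print(SpikingSelfAttention(64)(x).shape)  # torch.Size([2, 16, 64])
```

The key design point the abstract hints at is that softmax is unnecessary once queries and keys are non-negative spike maps, so the whole attention product stays in the cheap, event-driven regime.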

Tags: arxiv, network, neural network, spiking neural network, transformer
