Fourier or Wavelet bases as counterpart self-attention in spikformer for efficient visual classification
March 28, 2024, 4:42 a.m. | Qingyu Wang, Duzhen Zhang, Tielin Zhang, Bo Xu
cs.LG updates on arXiv.org arxiv.org
Abstract: The energy-efficient spikformer has been proposed by integrating the biologically plausible spiking neural network (SNN) with the artificial Transformer, where Spiking Self-Attention (SSA) is used to achieve both higher accuracy and lower computational cost. However, self-attention may not always be necessary, especially under sparse spike-form computation. In this paper, we replace vanilla SSA (which uses dynamic bases calculated from Query and Key) with spike-form Fourier Transform, Wavelet Transform, and their combinations (which use fixed …
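The paper's exact spike-form operators are not shown in this excerpt, but the core idea of swapping attention for a fixed transform can be sketched in dense form. The snippet below is a minimal illustration (FNet-style Fourier token mixing), assuming a `(seq_len, d_model)` input; the function name and shapes are illustrative, not the authors' API:

```python
import numpy as np

def fourier_token_mixer(x: np.ndarray) -> np.ndarray:
    """Mix tokens with a fixed 2D DFT instead of learned attention.

    x: array of shape (seq_len, d_model).
    Returns the real part of the DFT applied over both axes.
    No Query/Key projections are computed, so the mixing step
    has no learned parameters, unlike self-attention.
    """
    return np.fft.fft2(x).real

# Illustrative usage on random "token embeddings".
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
y = fourier_token_mixer(x)
assert y.shape == x.shape  # shape is preserved, like attention
```

Because the basis is fixed, the mixing cost is dominated by the FFT (O(n log n) per axis) rather than the O(n^2) Query-Key product of self-attention, which is the efficiency argument the abstract gestures at.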