Efficient Training of Spiking Neural Networks with Temporally-Truncated Local Backpropagation through Time. (arXiv:2201.07210v1 [cs.NE])
Jan. 20, 2022, 2:10 a.m. | Wenzhe Guo, Mohammed E. Fouda, Ahmed M. Eltawil, Khaled Nabil Salama
cs.LG updates on arXiv.org
Directly training spiking neural networks (SNNs) remains challenging due
to complex neural dynamics and the intrinsic non-differentiability of firing
functions. The well-known backpropagation through time (BPTT) algorithm
used to train SNNs suffers from a large memory footprint and prohibits
backward and update unlocking, making it impossible to exploit the potential of
locally-supervised training methods. This work proposes an efficient and direct
training algorithm for SNNs that integrates a locally-supervised training
method with a temporally-truncated BPTT algorithm. The proposed algorithm
explores both …
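To make the temporal-truncation idea concrete, here is a minimal sketch for a single leaky integrate-and-fire weight. The neuron model (no reset term), the sigmoid surrogate gradient, the forward-mode sensitivity accumulation, and all parameter names are illustrative assumptions, not the paper's actual algorithm: the point is only that dropping the credit-assignment chain every `K` steps bounds how far gradients flow back in time.

```python
import math

def surrogate_grad(v, theta=1.0, alpha=2.0):
    # Derivative of a sigmoid surrogate standing in for the
    # non-differentiable Heaviside spike function (an assumption;
    # the paper may use a different surrogate).
    s = 1.0 / (1.0 + math.exp(-alpha * (v - theta)))
    return alpha * s * (1.0 - s)

def truncated_grad(x, y, w, beta=0.9, theta=1.0, K=4):
    """Gradient of sum_t (s_t - y_t)^2 w.r.t. the input weight w,
    truncating the temporal credit-assignment chain every K steps.

    Simplified LIF dynamics without reset: v_t = beta * v_{t-1} + w * x_t.
    """
    v, dv_dw, grad = 0.0, 0.0, 0.0
    for t, (xt, yt) in enumerate(zip(x, y)):
        if t % K == 0:
            dv_dw = 0.0              # truncation: forget history older than K steps
        dv_dw = beta * dv_dw + xt    # sensitivity of membrane potential to w
        v = beta * v + w * xt        # membrane potential update
        s = 1.0 if v > theta else 0.0  # hard spike in the forward pass
        # Surrogate gradient replaces d(spike)/d(v) in the backward pass.
        grad += 2.0 * (s - yt) * surrogate_grad(v, theta) * dv_dw
    return grad
```

With `K=1` the gradient sees only the current step (a fully local-in-time update), while a large `K` recovers full-history BPTT; the truncation window trades gradient fidelity for memory, which is the efficiency lever the abstract describes.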