Jan. 20, 2022, 2:10 a.m. | Wenzhe Guo, Mohammed E. Fouda, Ahmed M. Eltawil, Khaled Nabil Salama

cs.LG updates on arXiv.org

Directly training spiking neural networks (SNNs) has remained challenging due
to complex neural dynamics and the intrinsic non-differentiability of firing
functions. The well-known backpropagation through time (BPTT) algorithm
proposed to train SNNs suffers from a large memory footprint and prohibits
backward and update unlocking, making it impossible to exploit the potential of
locally-supervised training methods. This work proposes an efficient and direct
training algorithm for SNNs that integrates a locally-supervised training
method with a temporally-truncated BPTT algorithm. The proposed algorithm
explores both …
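The abstract is truncated, but the two ingredients it names (a surrogate gradient for the non-differentiable firing function and temporal truncation of BPTT) are standard enough to illustrate. Below is a minimal sketch in PyTorch, not the authors' implementation: a single leaky integrate-and-fire layer with a fast-sigmoid surrogate gradient, trained by unrolling over time and cutting the backward graph every K steps. The layer sizes, the truncation length K, the rate-coded random input, and the linear readout are all illustrative assumptions, and the local-supervision component of the paper is not modeled here.

```python
# Sketch of temporally-truncated BPTT for a spiking layer (illustrative, not the paper's code).
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Surrogate derivative: 1 / (1 + |v|)^2
        return grad_output / (1.0 + v.abs()) ** 2


class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer with a learnable input projection."""

    def __init__(self, in_features, out_features, decay=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay, self.threshold = decay, threshold

    def forward(self, x_t, v):
        v = self.decay * v + self.fc(x_t)            # membrane potential update
        spike = SurrogateSpike.apply(v - self.threshold)
        v = v * (1.0 - spike)                        # hard reset after a spike
        return spike, v


def train_truncated_bptt(layer, readout, inputs, targets, K=4, lr=1e-3):
    """Unroll the SNN over time, back-propagating only within windows of K steps."""
    opt = torch.optim.SGD(list(layer.parameters()) + list(readout.parameters()), lr=lr)
    T, batch, _ = inputs.shape
    v = torch.zeros(batch, layer.fc.out_features)
    loss_fn = nn.CrossEntropyLoss()

    logits_acc, loss_total = 0.0, 0.0
    for t in range(T):
        spike, v = layer(inputs[t], v)
        logits_acc = logits_acc + readout(spike)
        if (t + 1) % K == 0:                         # end of a truncation window
            loss = loss_fn(logits_acc / K, targets)
            opt.zero_grad()
            loss.backward()
            opt.step()
            loss_total += loss.item()
            v = v.detach()                           # cut the graph: no gradient across windows
            logits_acc = 0.0
    return loss_total


if __name__ == "__main__":
    torch.manual_seed(0)
    T, batch, n_in, n_hidden, n_out = 16, 32, 100, 64, 10
    layer, readout = LIFLayer(n_in, n_hidden), nn.Linear(n_hidden, n_out)
    x = (torch.rand(T, batch, n_in) < 0.3).float()   # rate-coded random spike trains
    y = torch.randint(0, n_out, (batch,))
    print("summed window losses:", train_truncated_bptt(layer, readout, x, y))
```

Detaching the membrane potential at each window boundary is what bounds the memory footprint: activations only need to be stored for K time steps rather than the full sequence, at the cost of ignoring gradient contributions that span windows.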

Tags: arxiv, networks, neural networks, time, training
