The fine line between dead neurons and sparsity in binarized spiking neural networks. (arXiv:2201.11915v1 [cs.NE])
Web: http://arxiv.org/abs/2201.11915
Jan. 31, 2022, 2:11 a.m. | Jason K. Eshraghian, Wei D. Lu
cs.LG updates on arXiv.org
Spiking neural networks can compensate for quantization error by encoding information either in the temporal domain, or by processing discretized quantities in hidden states of higher precision. In theory, a wide dynamic-range state space enables multiple binarized inputs to be accumulated together, thus improving the representational capacity of individual neurons. This may be achieved by increasing the firing threshold, but make it too high, and sparse spike activity turns into no spike emission. In this paper, we propose the use …
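The trade-off described in the abstract can be made concrete with a toy simulation. The sketch below is plain NumPy, not the authors' code: the `lif_spike_count` helper, the leak factor `beta`, and all parameter values are illustrative assumptions. It drives a single leaky integrate-and-fire neuron with binary input spikes through binarized (+1/-1) weights; the full-precision membrane potential plays the role of the wide dynamic-range hidden state, and sweeping the firing threshold shows sparse activity eventually collapsing into no spike emission at all, i.e. a dead neuron.

```python
import numpy as np

def lif_spike_count(threshold, beta=0.9, n_steps=200, n_inputs=64, seed=0):
    """Count output spikes of one leaky integrate-and-fire (LIF) neuron.

    Binary input spikes arrive through binarized (+1/-1) weights, but the
    membrane potential `mem` accumulates them in full precision: this is
    the higher-precision hidden state the abstract refers to. Illustrative
    sketch only; parameter values are arbitrary assumptions.
    """
    rng = np.random.default_rng(seed)
    w = rng.choice([-1.0, 1.0], size=n_inputs)   # binarized weights
    mem = 0.0                                    # full-precision hidden state
    spikes = 0
    for _ in range(n_steps):
        x = rng.integers(0, 2, size=n_inputs)    # binary input spikes
        mem = beta * mem + float(w @ x)          # leaky accumulation
        if mem >= threshold:                     # fire, then soft reset
            spikes += 1
            mem -= threshold
    return spikes

for theta in (1.0, 25.0, 250.0, 2500.0):
    print(f"threshold={theta:7.1f}  spikes over 200 steps: {lif_spike_count(theta)}")
```

Running the sweep, the spike count drops as the threshold grows; once the threshold exceeds the largest value the leaky membrane can ever reach, the neuron stops firing entirely, illustrating how sparsity tips over into a dead neuron.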