Web: http://arxiv.org/abs/2201.11915

Jan. 31, 2022, 2:11 a.m. | Jason K. Eshraghian, Wei D. Lu

cs.LG updates on arXiv.org

Spiking neural networks can compensate for quantization error by encoding
information either in the temporal domain, or by processing discretized
quantities in hidden states of higher precision. In theory, a wide dynamic
range state-space enables multiple binarized inputs to be accumulated together,
thus improving the representational capacity of individual neurons. This may be
achieved by increasing the firing threshold, but make it too high, and sparse
spike activity turns into no spike emission. In this paper, we propose the use …
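The trade-off the abstract describes can be seen in a toy simulation. Below is a minimal sketch (not the paper's method, and the leaky integrate-and-fire model, parameter values, and input statistics are all illustrative assumptions): a single neuron accumulates binarized input spikes in a higher-precision membrane potential, and sweeping the firing threshold shows output activity moving from dense, to sparse, to fully silent.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_spike_count(threshold, n_steps=100, beta=0.9, p_in=0.3):
    """Count output spikes of a leaky integrate-and-fire neuron
    driven by a random binary (0/1) input spike train.

    All parameters here are illustrative assumptions, not values
    from the paper.
    """
    v = 0.0                                 # higher-precision hidden state
    spikes_out = 0
    inputs = rng.random(n_steps) < p_in     # binarized input spikes
    for s in inputs:
        v = beta * v + float(s)             # leaky accumulation of binary inputs
        if v >= threshold:                  # fire once the threshold is crossed
            spikes_out += 1
            v -= threshold                  # reset by subtraction
    return spikes_out

# Sweep the firing threshold: a higher threshold lets more binarized
# inputs accumulate per output spike (sparser, richer coding), but set
# it too high and the neuron stops emitting spikes altogether.
for theta in [0.5, 1.0, 2.0, 5.0, 20.0]:
    print(f"threshold={theta:5.1f}  output spikes={lif_spike_count(theta)}")
```

With a leak factor of 0.9 the membrane potential is bounded near 10 even under constant input, so the threshold of 20 in the sweep can never be reached: the neuron is effectively dead, which is exactly the failure mode the abstract warns about.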

Tags: arxiv, line, networks, neural networks, neurons, sparsity
