[R] Learning Long Sequences in Spiking Neural Networks
Jan. 11, 2024, 4:16 p.m. | /u/APaperADay
Machine Learning www.reddit.com
**Abstract**:
>Spiking neural networks (SNNs) take inspiration from the brain to enable energy-efficient computations. Since the advent of Transformers, SNNs have struggled to compete with artificial networks on modern sequential tasks, as they inherit limitations from recurrent neural networks (RNNs), with the added challenge of training with non-differentiable binary spiking activations. However, a recent renewed interest in efficient alternatives to Transformers has given rise to state-of-the-art recurrent architectures named state space models (SSMs). This work systematically investigates, for …
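The training challenge the abstract mentions comes from the spike itself: a neuron integrates input until a threshold, then emits a binary 0/1 event via a Heaviside step, whose derivative is zero almost everywhere (the standard workaround is a surrogate gradient in the backward pass). A minimal illustrative sketch of a leaky integrate-and-fire (LIF) forward pass, using generic textbook dynamics and parameters not taken from the paper:

```python
import numpy as np

def lif_forward(inputs, beta=0.9, threshold=1.0):
    """Simulate a leaky integrate-and-fire (LIF) neuron over time.

    inputs: 1-D array of input currents, one per timestep.
    beta: membrane leak factor (fraction of potential kept each step).
    threshold: firing threshold; crossing it emits a binary spike.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = beta * v + x  # leaky integration of the input current
        # Heaviside step: the non-differentiable binary activation
        s = 1.0 if v >= threshold else 0.0
        v = v - s * threshold  # soft reset after a spike
        spikes.append(s)
    return np.array(spikes)

# A constant drive accumulates until the neuron fires, then resets.
print(lif_forward(np.array([0.5] * 6)))  # → [0. 0. 1. 0. 1. 0.]
```

Because the output is a sparse binary train rather than a dense real-valued activation, computation can be event-driven and energy-efficient, but gradient-based training must route around the step function.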