Input correlations impede suppression of chaos and learning in balanced rate networks. (arXiv:2201.09916v1 [q-bio.NC])
Jan. 26, 2022, 2:10 a.m. | Rainer Engelken, Alessandro Ingrosso, Ramin Khajeh, Sven Goedeke, L. F. Abbott
cs.LG updates on arXiv.org
Neural circuits exhibit complex activity patterns, both spontaneously and in
response to external stimuli. Information encoding and learning in neural
circuits depend on how well time-varying stimuli can control spontaneous
network activity. We show that in firing-rate networks in the balanced state,
external control of recurrent dynamics, i.e., the suppression of
internally generated chaotic variability, strongly depends on correlations in
the input. A unique feature of balanced networks is that, because common
external input is dynamically canceled by recurrent feedback, it is …
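To make the setup concrete, below is a minimal sketch in Python/NumPy of the kind of experiment the abstract describes. It uses a generic chaotic rate network of the Sompolinsky-Crisanti-Sommers type rather than the paper's balanced architecture, and every parameter here (N, g, the sinusoidal drive, its amplitude and frequency) is an illustrative assumption. Two copies of the network with slightly different initial conditions receive identical input; if their trajectories keep diverging, the drive has failed to suppress the internally generated chaos.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's model): a random rate
# network dx/dt = -x + J*tanh(x) + I(t) with gain g > 1, so the
# autonomous dynamics are chaotic. We drive it with either independent
# per-neuron sinusoids or one common sinusoid and track how far two
# nearby trajectories, given identical input, drift apart.

rng = np.random.default_rng(0)
N, g, dt, steps = 200, 1.5, 0.05, 2000
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random coupling matrix

amp, freq = 2.0, 0.5                          # assumed drive parameters
phases = rng.uniform(0, 2 * np.pi, N)         # phases for independent inputs

def drive(t, common):
    """Sinusoidal input: identical across neurons, or phase-shuffled."""
    if common:
        return amp * np.sin(freq * t) * np.ones(N)
    return amp * np.sin(freq * t + phases)

def final_separation(common):
    """Evolve two perturbed copies under the same input; return |x - y|."""
    x = rng.standard_normal(N)
    y = x + 1e-6 * rng.standard_normal(N)     # tiny initial perturbation
    for step in range(steps):
        I = drive(step * dt, common)
        x = x + dt * (-x + J @ np.tanh(x) + I)
        y = y + dt * (-y + J @ np.tanh(y) + I)
    return np.linalg.norm(x - y)

for common in (False, True):
    label = "common" if common else "independent"
    print(f"{label:>11} input: final separation {final_separation(common):.3e}")
```

Note that in this unbalanced toy model common input is not dynamically canceled by recurrent feedback, so the contrast between the two conditions will generally be weaker than the effect the paper reports for balanced networks; the sketch only illustrates the measurement protocol.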