Feb. 8, 2024, 5:44 a.m. | Tommaso Salvatori, Yuhang Song, Yordan Yordanov, Beren Millidge, Zhenghua Xu, Lei Sha, Cornelius Emde

cs.LG updates on arXiv.org

Predictive coding networks are neuroscience-inspired models with roots in Bayesian statistics and neuroscience. Training such models, however, is inefficient and unstable. In this work, we show how simply changing the temporal scheduling of the update rule for the synaptic weights leads to an algorithm that is much more efficient and stable than the original one, and that has theoretical guarantees in terms of convergence. The proposed algorithm, which we call incremental predictive coding (iPC), is also more biologically …
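The abstract's core idea, the change in temporal scheduling, can be sketched in code. In standard predictive coding, inference (relaxing the latent activities) runs for many steps before the synaptic weights are updated once; in iPC, the weights are updated at every inference step. Below is a minimal NumPy sketch under that reading, using a toy two-layer linear generative model; the model, sizes, learning rates, and step counts are illustrative assumptions, not the paper's implementation.

import numpy as np

# Toy two-layer linear generative model: the prediction of the data y is
# W @ x, where x is a vector of latent activities and W the synaptic
# weights. All hyperparameters here are illustrative assumptions.
rng = np.random.default_rng(0)
W0 = rng.normal(scale=0.1, size=(10, 5))   # initial synaptic weights
y = rng.normal(size=10)                    # one observed data vector

def train_step(W, y, T=100, lr_x=0.1, lr_w=0.01, incremental=False):
    """One training step on a single data point y; returns updated weights."""
    x = np.zeros(W.shape[1])               # latent activities
    for _ in range(T):
        eps = y - W @ x                    # prediction error
        x = x + lr_x * (W.T @ eps)         # inference: relax the activities
        if incremental:                    # iPC: weight update at every step
            W = W + lr_w * np.outer(eps, x)
    if not incremental:                    # standard PC: a single weight
        eps = y - W @ x                    # update after inference settles
        W = W + lr_w * np.outer(eps, x)
    return W

for label, inc in (("standard PC", False), ("iPC", True)):
    W = train_step(W0.copy(), y, incremental=inc)
    x = np.linalg.lstsq(W, y, rcond=None)[0]   # best activities for new W
    print(f"{label}: residual error {np.linalg.norm(y - W @ x):.4f}")

The only difference between the two branches is when the weight update fires; the abstract's claim is that the incremental schedule yields a more efficient, more stable algorithm with convergence guarantees.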
