Improving Performance in Continual Learning Tasks using Bio-Inspired Architectures. (arXiv:2308.04539v1 [cs.LG])
cs.LG updates on arXiv.org
The ability to learn continuously from an incoming data stream without
catastrophic forgetting is critical to designing intelligent systems. Many
approaches to continual learning rely on stochastic gradient descent and its
variants, which employ global error updates and hence must adopt strategies
such as memory buffers or replay to circumvent their stability, greediness, and
short-term memory limitations. To address these limitations, we have developed a
biologically inspired, lightweight neural network architecture that incorporates
synaptic plasticity mechanisms and neuromodulation and …
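The abstract contrasts global gradient updates with local, biologically inspired plasticity. As a rough illustration only (the paper's actual update rules are not given in this excerpt), the sketch below shows a generic neuromodulated Hebbian update with Oja-style decay: each weight changes using only locally available pre- and post-synaptic activity, gated by a scalar modulatory signal. All names and equations here are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 3
W = rng.normal(scale=0.1, size=(n_out, n_in))  # synaptic weight matrix

def neuromodulated_hebbian_step(W, x, lr=0.01, modulator=1.0):
    """One local plasticity update (illustrative, not the paper's rule):
    dW_ij = modulator * lr * (y_i * x_j - y_i^2 * W_ij).
    The Oja-style decay term keeps the weights bounded; the scalar
    `modulator` stands in for a neuromodulatory gating signal."""
    y = np.tanh(W @ x)  # post-synaptic activity, computed locally
    dW = modulator * lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
    return W + dW

x = rng.normal(size=n_in)           # one sample from the data stream
W_new = neuromodulated_hebbian_step(W, x, modulator=0.5)
print(W_new.shape)                  # update preserves the weight shape
```

Note the contrast with SGD: no loss gradient is propagated from the output back through the network, so the update for each synapse depends only on its own pre- and post-synaptic activity plus the modulatory scalar.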