Representation learnt by SGD and Adaptive learning rules -- Conditions that Vary Sparsity and Selectivity in Neural Network. (arXiv:2201.11653v1 [cs.LG])
Web: http://arxiv.org/abs/2201.11653
cs.LG updates on arXiv.org
From the standpoint of the human brain, continual learning enables the performance of
various tasks without mutual interference. One effective way to reduce this
interference is through the sparsity and selectivity of neurons. According to
Aljundi et al. and Hadsell et al., imposing sparsity at the representational
level is advantageous for continual learning because sparse neuronal
activations encourage less overlap between parameters, resulting in less
interference. Similarly, highly selective neural networks are likely to induce
less interference since particular …
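The abstract's two key quantities can be made concrete. A minimal sketch, assuming NumPy and two commonly used definitions (not necessarily the paper's own): representational sparsity as the fraction of near-zero activations, and per-neuron class selectivity as the normalized gap between a neuron's most-activating class and its remaining classes.

```python
import numpy as np

def activation_sparsity(activations, eps=1e-6):
    """Fraction of activations at (or near) zero; higher means a
    sparser representation and, per the abstract, less overlap."""
    activations = np.asarray(activations)
    return float(np.mean(np.abs(activations) <= eps))

def class_selectivity(class_mean_activations):
    """Per-neuron selectivity (mu_max - mu_rest) / (mu_max + mu_rest),
    a common measure; whether the paper uses exactly this definition
    is an assumption. Input shape: (num_classes, num_neurons)."""
    a = np.asarray(class_mean_activations, dtype=float)
    mu_max = a.max(axis=0)                              # strongest class response
    mu_rest = (a.sum(axis=0) - mu_max) / (a.shape[0] - 1)  # mean of the rest
    return (mu_max - mu_rest) / (mu_max + mu_rest + 1e-12)

# Example: half the activations are zero -> sparsity 0.5
print(activation_sparsity([0.0, 0.0, 1.3, 2.1]))  # 0.5
# Each neuron responds to exactly one class -> selectivity near 1
print(class_selectivity([[1.0, 0.0], [0.0, 1.0]]))
```

Both measures lie in [0, 1], so they can be tracked side by side across training to see how a given learning rule (SGD vs. an adaptive method) shifts the representation.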