IMEX-Reg: Implicit-Explicit Regularization in the Function Space for Continual Learning
April 30, 2024, 4:42 a.m. | Prashant Bhat, Bharath Renjith, Elahe Arani, Bahram Zonooz
cs.LG updates on arXiv.org (arxiv.org)
Abstract: Continual learning (CL) remains one of the long-standing challenges for deep neural networks due to catastrophic forgetting of previously acquired knowledge. Although rehearsal-based approaches have been fairly successful in mitigating catastrophic forgetting, they suffer from overfitting on buffered samples and prior information loss, hindering generalization under low-buffer regimes. Inspired by how humans learn using strong inductive biases, we propose IMEX-Reg to improve the generalization performance of experience rehearsal in CL under low-buffer regimes. Specifically, …
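For readers unfamiliar with the mechanism the abstract builds on, the sketch below illustrates generic experience rehearsal: a small fixed-capacity memory of past examples is mixed into each training step on the current task. This is not IMEX-Reg itself, only the baseline it regularizes; the buffer here uses reservoir sampling as one common choice, and names such as `ReplayBuffer` and `rehearsal_step` are hypothetical illustrations rather than anything from the paper.

```python
# Minimal sketch of experience rehearsal (not the IMEX-Reg method itself).
# Reservoir sampling keeps a fixed-size, uniformly sampled memory of past
# examples; a small `capacity` corresponds to the "low-buffer regime".
import random


class ReplayBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0  # total number of examples offered to the buffer

    def add(self, example):
        """Reservoir sampling: each example seen so far is kept with equal probability."""
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))


def rehearsal_step(model_update, current_batch, buffer, replay_size=8):
    """One rehearsal step: update the model on the current batch mixed with replayed examples."""
    replayed = buffer.sample(replay_size)
    model_update(current_batch + replayed)  # user-supplied update, e.g. one SGD step
    for example in current_batch:
        buffer.add(example)
```

Per the abstract, repeatedly replaying the same few buffered examples is exactly where this baseline overfits and loses prior information; IMEX-Reg's contribution is to add implicit-explicit regularization in the function space on top of such a rehearsal loop to improve generalization when the buffer is small.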