April 12, 2024, 4:41 a.m. | Yuwei Sun, Jun Sakuma, Ryota Kanai

cs.LG updates on arXiv.org

arXiv:2404.07518v1 Announce Type: new
Abstract: Neural networks encounter the challenge of Catastrophic Forgetting (CF) in continual learning, where new task knowledge interferes with previously learned knowledge. We propose Remembering Transformer, inspired by the brain's Complementary Learning Systems (CLS), to tackle this issue. Remembering Transformer employs a mixture-of-adapters and a generative model-based routing mechanism to alleviate CF by dynamically routing task data to relevant adapters. Our approach demonstrates new SOTA performance on various vision continual learning tasks and great parameter …
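To make the abstract's architecture concrete, here is a minimal sketch of a mixture-of-adapters with generative routing. This is one plausible reading of the abstract, not the paper's actual implementation: the class names (Adapter, Router, MixtureOfAdapters), the low-rank bottleneck adapter design, and the use of per-task autoencoder reconstruction error as the routing score are all assumptions; the exact mechanisms in arXiv:2404.07518 may differ.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Low-rank residual adapter attached to a frozen backbone layer
    (hypothetical form; the paper's adapter design may differ)."""

    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen-backbone features plus a small learned correction.
        return x + self.up(torch.relu(self.down(x)))


class Router(nn.Module):
    """One tiny autoencoder per task; low reconstruction error suggests
    the input belongs to that task (an assumed instantiation of the
    abstract's 'generative model-based routing')."""

    def __init__(self, dim: int, code: int = 8):
        super().__init__()
        self.enc = nn.Linear(dim, code)
        self.dec = nn.Linear(code, dim)

    def recon_error(self, x: torch.Tensor) -> torch.Tensor:
        # Mean squared reconstruction error per batch element.
        return ((self.dec(torch.relu(self.enc(x))) - x) ** 2).mean(dim=-1)


class MixtureOfAdapters(nn.Module):
    """Grows one (adapter, router) pair per continual-learning task and
    routes each sample to the adapter whose router reconstructs it best."""

    def __init__(self):
        super().__init__()
        self.adapters = nn.ModuleList()
        self.routers = nn.ModuleList()

    def add_task(self, dim: int) -> None:
        # New task arrives: allocate a fresh adapter and its router,
        # leaving previously trained pairs untouched (mitigating CF).
        self.adapters.append(Adapter(dim))
        self.routers.append(Router(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim). Score every router on the pooled input,
        # then dispatch each sample to its lowest-error adapter.
        pooled = x.mean(dim=1)
        errors = torch.stack(
            [r.recon_error(pooled) for r in self.routers], dim=-1
        )
        idx = errors.argmin(dim=-1)  # per-sample routing decision
        out = torch.empty_like(x)
        for i, adapter in enumerate(self.adapters):
            mask = idx == i
            if mask.any():
                out[mask] = adapter(x[mask])
        return out


# Usage sketch: two tasks, route a batch without knowing task identity.
moa = MixtureOfAdapters()
moa.add_task(dim=64)
moa.add_task(dim=64)
y = moa(torch.randn(4, 10, 64))  # (batch=4, seq=10, dim=64)
```

The design choice this sketch highlights is that routing needs no task labels at inference: each task's generative model acts as a novelty detector over its own data distribution, so new-task training only ever touches freshly allocated parameters.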

