Dec. 16, 2023, 7:49 a.m. | /u/APaperADay

r/MachineLearning (www.reddit.com)

**arXiv**: [https://arxiv.org/abs/2312.00276](https://arxiv.org/abs/2312.00276)

**OpenReview**: [https://openreview.net/forum?id=5twh6pM4SR](https://openreview.net/forum?id=5twh6pM4SR)

**Code**: [https://github.com/IDSIA/automated-cl](https://github.com/IDSIA/automated-cl)

**Abstract**:

>General-purpose learning systems should improve themselves in open-ended fashion in ever-changing environments. Conventional learning algorithms for neural networks, however, suffer from catastrophic forgetting (CF) -- previously acquired skills are forgotten when a new task is learned. Instead of hand-crafting new algorithms for avoiding CF, we propose Automated Continual Learning (ACL) to train self-referential neural networks to meta-learn their own in-context continual (meta-)learning algorithms. ACL encodes all desiderata -- good performance on both old …
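The catastrophic forgetting (CF) the abstract refers to is easy to reproduce: train a model on task A, then continue training the same weights on task B, and performance on task A collapses. The toy sketch below (not from the paper; a minimal NumPy illustration of the problem ACL targets, with made-up task weights) shows this with plain SGD on two noiseless linear-regression tasks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy regression tasks defined by different target weight vectors.
w_a = np.array([1.0, -2.0])
w_b = np.array([-3.0, 0.5])

def make_task(w, n=200):
    X = rng.normal(size=(n, 2))
    return X, X @ w  # noiseless linear targets

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def sgd(w, X, y, lr=0.05, steps=1000):
    # Cycle through the samples with per-example gradient steps.
    for i in range(steps):
        j = i % len(X)
        grad = 2 * (X[j] @ w - y[j]) * X[j]
        w = w - lr * grad
    return w

Xa, ya = make_task(w_a)
Xb, yb = make_task(w_b)

w = np.zeros(2)
w = sgd(w, Xa, ya)                 # learn task A
loss_a_before = mse(w, Xa, ya)     # near zero after convergence
w = sgd(w, Xb, yb)                 # continue training on task B
loss_a_after = mse(w, Xa, ya)      # task-A loss blows up: forgetting

print(loss_a_before, loss_a_after)
```

ACL's pitch is to meta-learn an in-context continual-learning rule whose meta-objective rewards retaining `loss_a`-style performance while acquiring task B, instead of hand-designing a regularizer against this failure mode.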

