March 11, 2024, 4:41 a.m. | Gido M. van de Ven, Nicholas Soures, Dhireesha Kudithipudi

cs.LG updates on arXiv.org

arXiv:2403.05175v1 Announce Type: new
Abstract: This book chapter delves into the dynamics of continual learning, which is the process of incrementally learning from a non-stationary stream of data. Although continual learning is a natural skill for the human brain, it is very challenging for artificial neural networks. An important reason is that, when learning something new, these networks tend to quickly and drastically forget what they had learned before, a phenomenon known as catastrophic forgetting. Especially in the last decade, …

Subjects: cs.LG, cs.AI, cs.CV, q-bio.NC, stat.ML
