Web: http://arxiv.org/abs/2206.11849

June 24, 2022, 1:10 a.m. | Mattia Sangermano, Antonio Carta, Andrea Cossu, Davide Bacciu

cs.LG updates on arXiv.org (arxiv.org)

Online continual learning is a challenging learning scenario in which the model
must learn from a non-stationary stream of data where each sample is seen only
once. The main challenge is to learn incrementally while avoiding catastrophic
forgetting, namely the problem of forgetting previously acquired knowledge
while learning from new data. A popular solution in this scenario is to use a
small memory to retain old data and rehearse it over time. Unfortunately, due
to the limited memory size, the quality …
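The rehearsal idea described above can be sketched as a small replay buffer that is mixed into each online update. This is a minimal illustration only, not the method proposed in the paper; the reservoir-sampling policy, the names `ReplayBuffer` and `rehearsal_step`, and the generic PyTorch classifier interface are assumptions made for the example.

```python
# Minimal rehearsal sketch for online continual learning (illustrative only).
import random
import torch
import torch.nn.functional as F


class ReplayBuffer:
    """Fixed-size memory filled with reservoir sampling over the stream."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # list of (x, y) tensor pairs kept in memory
        self.seen = 0    # number of stream samples observed so far

    def add(self, x, y):
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                # Keep each seen sample with probability capacity / seen.
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def rehearsal_step(model, optimizer, buffer, x, y, mem_batch_size=32):
    """One online update: loss on the incoming batch plus a memory batch."""
    model.train()
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    if len(buffer.data) > 0:
        mx, my = buffer.sample(mem_batch_size)
        loss = loss + F.cross_entropy(model(mx), my)
    loss.backward()
    optimizer.step()
    buffer.add(x.detach(), y.detach())   # each stream sample is seen only once
    return loss.item()
```

Because the memory is much smaller than the stream, what is stored in it (and how informative those samples are) is the limiting factor this line of work targets.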

arxiv continual learning lg online
