Active Continual Learning: On Balancing Knowledge Retention and Learnability. (arXiv:2305.03923v2 [cs.LG] UPDATED)
cs.CL updates on arXiv.org
Acquiring new knowledge without forgetting what has been learned across a
sequence of tasks is the central focus of continual learning (CL). While tasks
arrive sequentially, their training data are typically prepared and annotated
independently, so CL in practice reduces to a stream of incoming supervised
learning tasks. This paper considers the under-explored problem of active
continual learning (ACL) over a sequence of active learning (AL) tasks, where
each incoming task provides a pool of unlabelled data and an annotation budget.
We investigate the …
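The ACL setup described above can be sketched in a few lines: tasks arrive one at a time, each with an unlabelled pool and a fixed annotation budget; an acquisition function picks which points to label, and a single model is updated continually. This is a minimal toy sketch, not the paper's method: the nearest-centroid "model", the margin-based `uncertainty` scorer, and all function names are illustrative assumptions.

```python
import random

random.seed(0)

def run_acl(tasks, budget, acquire):
    """Active continual learning loop (toy sketch): for each incoming task,
    spend the annotation budget on the highest-scoring pool points, then
    update one shared model on the newly labelled data only."""
    model = {}    # class label -> running mean feature (toy centroid model)
    counts = {}   # class label -> number of labelled points seen
    labelled = []
    for pool in tasks:                               # tasks arrive sequentially
        scored = sorted(pool, key=lambda xy: acquire(model, xy[0]), reverse=True)
        batch = scored[:budget]                      # query the oracle within budget
        for x, y in batch:                           # oracle reveals label y
            counts[y] = counts.get(y, 0) + 1
            # incremental mean update of the class centroid
            model[y] = model.get(y, 0.0) + (x - model.get(y, 0.0)) / counts[y]
        labelled.extend(batch)
    return model, labelled

def uncertainty(model, x):
    """Margin-style uncertainty for the toy centroid model: points that lie
    close to two class centroids at once score highest."""
    if len(model) < 2:
        return random.random()                       # cold start: random scores
    d = sorted(abs(x - c) for c in model.values())
    return -(d[1] - d[0])                            # small margin -> high score

# Two toy 1-D tasks whose class clusters shift between tasks.
t1 = [(random.gauss(m, 0.5), y) for m, y in [(-2, "neg"), (2, "pos")] for _ in range(20)]
t2 = [(random.gauss(m, 0.5), y) for m, y in [(-4, "neg"), (4, "pos")] for _ in range(20)]
model, labelled = run_acl([t1, t2], budget=5, acquire=uncertainty)
```

Only `budget` points per task are ever labelled, and the model is never retrained on earlier tasks' pools, which is exactly where the retention-vs-learnability tension in the abstract arises.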