SIESTA: Efficient Online Continual Learning with Sleep. (arXiv:2303.10725v3 [cs.CV] UPDATED)
cs.LG updates on arXiv.org
In supervised continual learning, a deep neural network (DNN) is updated with
an ever-growing data stream. Unlike the offline setting where data is shuffled,
we cannot make any distributional assumptions about the data stream. Ideally,
only one pass through the dataset is needed for computational efficiency.
However, existing methods fall short: they rely on assumptions that do not hold
in real-world applications, while also failing to improve
computational efficiency. In this paper, we propose a novel continual learning
method, SIESTA …
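The setting above (an unshuffled, ever-growing stream seen in a single pass) can be illustrated with a minimal sketch. This is a generic single-pass online learner, not SIESTA's actual algorithm (the abstract is truncated before the method is described); the linear model, logistic update, and learning rate are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
w_true = rng.normal(size=d)  # hidden ground-truth direction (for the toy stream)

def stream(n=500):
    """Yield (x, label) pairs once, in arrival order: no shuffling, no second pass."""
    for _ in range(n):
        x = rng.normal(size=d)
        yield x, float(x @ w_true > 0)  # binary label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(d)
lr = 0.5
for x, y in stream():            # exactly one pass over the data stream
    p = sigmoid(w @ x)
    w += lr * (y - p) * x        # per-sample logistic-regression SGD update
```

After the single pass, the learned weight vector should roughly align with the ground-truth direction; real continual-learning methods add mechanisms (e.g. rehearsal or, in SIESTA's case, a sleep phase) on top of this basic streaming update to handle distribution shift.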