Feb. 8, 2024, 5:43 a.m. | Chi Ian Tang, Lorena Qendro, Dimitris Spathis, Fahim Kawsar, Cecilia Mascolo, Akhil Mathur

cs.LG updates on arXiv.org

Self-supervised learning (SSL) has shown remarkable performance in computer vision tasks when trained offline. However, in a Continual Learning (CL) scenario where new data is introduced progressively, models still suffer from catastrophic forgetting. Retraining a model from scratch to adapt to newly generated data is time-consuming and inefficient. Previous approaches suggested re-purposing self-supervised objectives with knowledge distillation to mitigate forgetting across tasks, assuming that labels from all tasks are available during fine-tuning. In this paper, we generalize self-supervised continual learning …
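To make the distillation idea from the prior approaches concrete, below is a minimal, illustrative sketch (not the paper's actual method): a self-supervised contrastive objective (SimCLR-style NT-Xent) combined with a feature-distillation term computed against a frozen copy of the model trained on earlier data. The toy encoder, loss weighting alpha, and the choice of MSE for distillation are assumptions made only for this example.

# Illustrative sketch: SSL contrastive loss + knowledge distillation
# from a frozen previous-task model, a common recipe for mitigating
# catastrophic forgetting in continual self-supervised learning.
import copy
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    # SimCLR-style contrastive loss between two augmented views.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # pairwise similarities
    labels = torch.arange(z1.size(0))       # matching views are positives
    return F.cross_entropy(logits, labels)

def distillation(student_feat, teacher_feat):
    # Feature-level distillation: keep current features close to those
    # of the frozen model trained on earlier data.
    return F.mse_loss(student_feat, teacher_feat.detach())

def train_step(model, frozen_prev_model, x_view1, x_view2, optimizer, alpha=1.0):
    z1, z2 = model(x_view1), model(x_view2)
    with torch.no_grad():
        t1 = frozen_prev_model(x_view1)
    loss = nt_xent(z1, z2) + alpha * distillation(z1, t1)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: a tiny encoder and random "augmented views" of a batch.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(32 * 32 * 3, 128))
prev_encoder = copy.deepcopy(encoder).eval()
for p in prev_encoder.parameters():
    p.requires_grad_(False)
opt = torch.optim.SGD(encoder.parameters(), lr=0.01)
x1, x2 = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
print(train_step(encoder, prev_encoder, x1, x2, opt))

The distillation weight alpha trades off plasticity on the new data against retention of representations learned on earlier tasks; the actual objective and its weighting in the paper may differ.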

