Preventing Catastrophic Forgetting through Memory Networks in Continuous Detection
March 25, 2024, 4:42 a.m. | Gaurav Bhatt, James Ross, Leonid Sigal
cs.LG updates on arXiv.org
Abstract: Modern pre-trained architectures struggle to retain previous information while undergoing continuous fine-tuning on new tasks. Despite notable progress in continual classification, systems designed for complex vision tasks such as detection or segmentation still fall short of satisfactory performance. In this work, we introduce a memory-based detection transformer architecture to adapt a pre-trained DETR-style detector to new tasks while preserving knowledge from previous tasks. We propose a novel localized query function for efficient information retrieval from …
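The abstract describes a memory bank queried during continual fine-tuning so a DETR-style detector can retrieve knowledge from earlier tasks. The paper's actual localized query function is not shown here; the following is a minimal, self-contained sketch of the general idea, retrieving stored task features by key similarity. All names (`MemoryBank`, `localized_query`, the FIFO eviction policy) are illustrative assumptions, not the authors' implementation.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryBank:
    """Toy external memory: stores (key, payload) pairs from past tasks."""

    def __init__(self, capacity=100):
        self.capacity = capacity
        self.entries = []  # list of (key_vector, payload) tuples

    def write(self, key, payload):
        # Simple FIFO eviction when the bank is full (an assumption;
        # real systems often use more careful exemplar selection).
        if len(self.entries) >= self.capacity:
            self.entries.pop(0)
        self.entries.append((key, payload))

    def localized_query(self, query, top_k=2):
        # Return the payloads of the top-k most similar keys --
        # a stand-in for the paper's localized query function.
        scored = sorted(self.entries,
                        key=lambda e: cosine(query, e[0]),
                        reverse=True)
        return [payload for _, payload in scored[:top_k]]

bank = MemoryBank()
bank.write([1.0, 0.0], "task-1: car features")
bank.write([0.0, 1.0], "task-2: person features")
bank.write([0.9, 0.1], "task-1: truck features")

# A query close to task-1 keys retrieves task-1 memories first.
print(bank.localized_query([1.0, 0.05], top_k=2))
```

In a real continual-detection setup the keys and payloads would be learned feature embeddings from the frozen or slowly updated detector, and retrieval would condition the transformer's queries rather than return strings.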