Balanced softmax cross-entropy for incremental learning with and without memory. (arXiv:2103.12532v5 [cs.LG] UPDATED)
Nov. 15, 2022, 2:15 a.m. | Quentin Jodelet, Xin Liu, Tsuyoshi Murata
cs.CV updates on arXiv.org
When incrementally trained on new classes, deep neural networks are subject
to catastrophic forgetting, which leads to a severe deterioration of their
performance on the old classes while learning the new ones. Using a small
memory containing a few samples from past classes has been shown to be an
effective way to mitigate catastrophic forgetting. However, due to the
limited size of the replay memory, there is a large imbalance between the
number of samples for the new and the old classes …
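For context, here is a minimal sketch of a balanced softmax cross-entropy loss, following the balanced softmax formulation of Ren et al. (NeurIPS 2020) that the title refers to: each class logit is shifted by the log of that class's sample count before the standard cross-entropy is applied, so the over-represented new classes do not dominate the few replayed old-class samples. The names `balanced_softmax_cross_entropy` and `class_counts` are illustrative, and this sketch is not the paper's exact incremental-learning recipe.

```python
import torch
import torch.nn.functional as F

def balanced_softmax_cross_entropy(logits, targets, class_counts):
    # Shift each class logit by log(n_c): equivalent to weighting the
    # softmax probability of class c by its sample count n_c, which
    # compensates for the old/new class imbalance described above.
    adjusted = logits + torch.log(class_counts.float())
    return F.cross_entropy(adjusted, targets)

# Toy usage (hypothetical numbers): 3 old classes with 20 replay
# samples each, 2 new classes with 500 samples each.
class_counts = torch.tensor([20, 20, 20, 500, 500])
logits = torch.randn(8, 5)           # batch of 8, 5 classes
targets = torch.randint(0, 5, (8,))  # ground-truth labels
loss = balanced_softmax_cross_entropy(logits, targets, class_counts)
print(loss.item())
```

In an incremental setting, `class_counts` would typically be recomputed at each task from the current training set, i.e. the new-class samples plus whatever sits in the replay memory.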