Balanced softmax cross-entropy for incremental learning

Q Jodelet, X Liu, T Murata - International Conference on Artificial Neural Networks, 2021 - Springer
Deep neural networks are prone to catastrophic forgetting when incrementally trained on
new classes or new tasks, as adaptation to the new data leads to a drastic decrease in
performance on the old classes and tasks. By using a small rehearsal memory and
knowledge distillation, recent methods have proven effective at mitigating catastrophic
forgetting. However, due to the limited size of the memory, a large imbalance between the
amount of data available for the old and new classes remains, which results in a …
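The balanced softmax the title refers to rescales each class logit by its training frequency before the cross-entropy is taken, so that scarce rehearsal samples of old classes are not drowned out by abundant new-class data. A minimal PyTorch sketch, assuming per-class sample counts are known; the function name and toy counts below are illustrative, not taken from the paper:

    import torch
    import torch.nn.functional as F

    def balanced_softmax_ce(logits, targets, class_counts):
        # Balanced softmax (Ren et al., 2020): adding log(n_c) to each logit
        # before the softmax weighs each class by its sample count, which
        # compensates the old-vs-new class imbalance during training.
        adjusted = logits + torch.log(class_counts.float())
        return F.cross_entropy(adjusted, targets)

    # Toy incremental step: two old classes with 20 memory samples each,
    # two new classes with 5000 training samples each.
    class_counts = torch.tensor([20, 20, 5000, 5000])
    logits = torch.randn(8, 4)              # batch of 8 over 4 classes
    targets = torch.randint(0, 4, (8,))
    loss = balanced_softmax_ce(logits, targets, class_counts)

In the balanced-softmax formulation the log-count offset applies only during training; at test time the unmodified logits go through a plain softmax.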

Balanced softmax cross-entropy for incremental learning with and without memory

Q Jodelet, X Liu, T Murata - Computer Vision and Image Understanding, 2022 - Elsevier
When incrementally trained on new classes, deep neural networks are subject to
catastrophic forgetting, which leads to an extreme deterioration of their performance on the
old classes while learning the new ones. Using a small memory containing a few samples
from past classes has been shown to be an effective way to mitigate catastrophic forgetting.
However, due to the limited size of the replay memory, there is a large imbalance between
the number of samples for the new and the old classes in the training dataset, resulting in …
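In an incremental step, the counts fed to such a loss come from the union of the small replay memory and the full new-class training data. A sketch of that bookkeeping under the same assumptions as above; the helper name and sizes are hypothetical, and the papers' exact memory-management strategy is not reproduced here:

    from collections import Counter
    import torch

    def step_class_counts(memory_labels, new_labels, num_classes):
        # Old classes contribute only their few replayed samples, new classes
        # the full current dataset: exactly the imbalance the balanced
        # softmax corrects for. The floor of 1 avoids log(0) for any class
        # absent from the current step.
        counts = Counter(memory_labels)
        counts.update(new_labels)
        return torch.tensor([max(counts.get(c, 0), 1) for c in range(num_classes)])

    # e.g. a 2000-sample memory spread over 10 old classes,
    # versus 5000 images for each of 2 new classes
    memory_labels = [c for c in range(10) for _ in range(200)]
    new_labels = [c for c in range(10, 12) for _ in range(5000)]
    counts = step_class_counts(memory_labels, new_labels, num_classes=12)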