1. Reducing Catastrophic Forgetting in Online Class Incremental Learning Using Self-Distillation
- Author
Nagata, Kotaro, Ono, Hiromu, and Hotta, Kazuhiro
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
In continual learning, catastrophic forgetting is a serious problem: previously acquired knowledge is forgotten when a model learns new tasks. Various methods have been proposed to solve this problem. Replay methods, which replay data from previous tasks during later training, have shown good accuracy. However, replay methods suffer from poor generalizability because of the limited memory buffer. In this paper, we address this problem by acquiring transferable knowledge through self-distillation, using the highly generalizable output of a shallow layer as a teacher. Furthermore, when dealing with a large number of classes or challenging data, there is a risk that learning does not converge and the model underfits rather than overfits. We therefore aim at more efficient and thorough learning by prioritizing the storage of easily misclassified samples through a new memory update method. We confirmed that the proposed method outperforms conventional methods in experiments on the CIFAR10, CIFAR100, and MiniImageNet datasets.
- Comment
10 pages, 2 figures
- Published
2024
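
To make the two ideas in the abstract concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: an auxiliary classifier attached to a shallow block provides the "teacher" logits for self-distillation of the final output, and a toy memory update keeps the samples the model misclassifies most easily. All names (ShallowTeacherNet, self_distillation_loss, update_memory) and hyper-parameters are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch (assumed names and hyper-parameters, not the authors' code) of:
# (1) self-distillation using the output of a shallow layer as the teacher, and
# (2) a memory update that prioritizes easily misclassified samples.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ShallowTeacherNet(nn.Module):
    """Small CNN with an auxiliary classifier attached to a shallow block."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.block1 = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.block2 = nn.Sequential(
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.shallow_head = nn.Linear(32, num_classes)  # "teacher" head
        self.deep_head = nn.Linear(64, num_classes)     # "student" head

    def forward(self, x):
        h1 = self.block1(x)
        shallow_logits = self.shallow_head(h1.mean(dim=(2, 3)))
        deep_logits = self.deep_head(self.block2(h1).flatten(1))
        return shallow_logits, deep_logits


def self_distillation_loss(shallow_logits, deep_logits, targets,
                           temp=2.0, alpha=0.5):
    """Cross-entropy on both heads plus a KL term that distills the
    (detached) shallow output into the deep output."""
    ce = (F.cross_entropy(deep_logits, targets)
          + F.cross_entropy(shallow_logits, targets))
    teacher = F.softmax(shallow_logits.detach() / temp, dim=1)
    student = F.log_softmax(deep_logits / temp, dim=1)
    kd = F.kl_div(student, teacher, reduction="batchmean") * temp ** 2
    return ce + alpha * kd


def update_memory(buffer, x, y, deep_logits, capacity=200):
    """Keep the samples with the lowest softmax score on their true class,
    i.e. the most easily misclassified ones."""
    probs = F.softmax(deep_logits.detach(), dim=1)
    scores = probs.gather(1, y.unsqueeze(1)).squeeze(1)
    for xi, yi, si in zip(x, y, scores):
        buffer.append((si.item(), xi.cpu(), yi.cpu()))
    buffer.sort(key=lambda t: t[0])  # least confident first
    del buffer[capacity:]            # drop the easiest samples


# Usage on a random CIFAR-like batch (3x32x32 inputs, 10 classes).
model = ShallowTeacherNet(num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
buffer = []
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
shallow, deep = model(x)
loss = self_distillation_loss(shallow, deep, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
update_memory(buffer, x, y, deep)
```

Detaching the shallow logits in the distillation term mirrors the usual teacher/student asymmetry, so the gradient of the KL term only pushes the deep head toward the shallow one; the actual loss weighting, buffer policy, and network used in the paper may differ.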