Class-Incremental Learning with Topological Schemas of Memory Spaces
- Author
Xiaoyu Tao, Xinyuan Chang, Yihong Gong, Xiaopeng Hong, Xing Wei, and Wei Ke
- Subjects
Forgetting, Artificial neural network, Computer science, Gaussian, Posterior probability, Topology, Mixture model, Embedding, Artificial intelligence, Gaussian process, Classifier
- Abstract
Class-incremental learning (CIL) aims to incrementally learn a unified classifier for newly emerging classes, and it suffers from the catastrophic forgetting problem. To alleviate forgetting and improve recognition performance, we propose a novel CIL framework, named the topological schemas model (TSM). TSM consists of a Gaussian mixture model arranged on 2D grids (2D-GMM) as the memory of the learned knowledge. To train the 2D-GMM model, we develop a novel competitive expectation-maximization (CEM) method, which contains a global topology embedding step and a local expectation-maximization fine-tuning step. Meanwhile, we choose the image samples of old classes that have the maximum posterior probability with respect to each Gaussian distribution as the episodic points. When fine-tuning for new classes, we propose the memory preservation loss (MPL) term to ensure that the episodic points still have maximum probabilities with respect to their corresponding Gaussian distributions. MPL preserves the distribution of the 2D-GMM for old knowledge during incremental learning and alleviates catastrophic forgetting. Comprehensive experimental evaluations on two popular CIL benchmarks, CIFAR100 and subImageNet, demonstrate the superiority of our TSM.
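The episodic-point selection described in the abstract (picking, for each Gaussian component, the old-class sample with the maximum posterior probability) can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions of my own: diagonal-covariance Gaussians and a function name `select_episodic_points` that does not come from the paper; the paper's actual 2D-GMM, CEM training, and MPL loss are not reproduced here.

```python
import numpy as np

def select_episodic_points(X, means, variances, weights):
    """For each Gaussian component, return the index of the sample with the
    maximum posterior probability (responsibility) under that component.
    Hypothetical simplification: diagonal covariances, computed in log space.
    X: (n, d) samples; means, variances: (K, d); weights: (K,)."""
    diff = X[:, None, :] - means[None, :, :]                      # (n, K, d)
    # log N(x | mu_k, diag(var_k)) for every sample/component pair
    log_pdf = -0.5 * np.sum(diff ** 2 / variances[None]
                            + np.log(2 * np.pi * variances[None]), axis=2)  # (n, K)
    log_joint = np.log(weights)[None, :] + log_pdf                # (n, K)
    # posterior p(k | x) = joint / marginal, normalized in log space
    log_post = log_joint - np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
    return np.argmax(log_post, axis=0)                            # (K,) episodic-point indices
```

For two well-separated components, each selected index falls inside the cluster generated around that component's mean, which is the behavior the abstract relies on when replaying episodic points during incremental fine-tuning.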
- Published
- 2021