Class-Incremental Learning of Convolutional Neural Networks Based on Double Consolidation Mechanism
- Source :
- IEEE Access, Vol. 8, pp. 172553-172562 (2020)
- Publication Year :
- 2020
- Publisher :
- IEEE, 2020.
Abstract
- Class-incremental learning is a model learning technique that helps classification models incrementally learn new target classes and accumulate knowledge. It has become one of the major concerns of the machine learning and classification community. To overcome the catastrophic forgetting that occurs when a network is trained sequentially on a multi-class data stream, a double consolidation class-incremental learning (DCCIL) method is proposed. During incremental learning, the network parameters are adjusted by combining knowledge distillation and elastic weight consolidation, so that the network better retains its recognition ability on the old classes while learning the new ones. An incremental learning experiment is designed, and the proposed method is compared with popular incremental learning methods such as EWC, LwF, and iCaRL. Experimental results show that DCCIL achieves higher incremental accuracy than these popular incremental learning algorithms, effectively improving the extensibility and intelligence of the classification model.
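- The "double consolidation" described in the abstract (knowledge distillation plus elastic weight consolidation) can be illustrated with a minimal PyTorch-style sketch of a combined training loss. This is not the authors' implementation; the function name `dccil_loss`, the hyperparameters `lambda_kd` and `lambda_ewc`, the temperature `T`, and the assumption that the new network's first output units correspond to the old classes are all illustrative assumptions.

```python
# Sketch of a combined loss for class-incremental training, assuming:
#   new_logits: outputs of the current network (old + new classes)
#   old_logits: outputs of the frozen previous network (old classes only)
#   old_params / fisher: snapshots of the old parameters and their
#   Fisher-information estimates, keyed by parameter name.
import torch
import torch.nn.functional as F

def dccil_loss(new_logits, old_logits, targets,
               model, old_params, fisher,
               lambda_kd=1.0, lambda_ewc=100.0, T=2.0):
    # Standard cross-entropy on the current (new-class) labels.
    ce = F.cross_entropy(new_logits, targets)

    # Knowledge-distillation term: keep the new network's predictions on
    # the old classes close to the frozen old network's soft targets.
    n_old = old_logits.size(1)
    kd = F.kl_div(
        F.log_softmax(new_logits[:, :n_old] / T, dim=1),
        F.softmax(old_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Elastic-weight-consolidation penalty: quadratic anchor on parameters
    # that were important (high Fisher information) for the old classes.
    ewc = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            ewc = ewc + (fisher[name] * (p - old_params[name]) ** 2).sum()

    return ce + lambda_kd * kd + lambda_ewc * ewc
```

- In this reading, distillation constrains the network's outputs on old classes while the EWC term constrains the parameters themselves, which is one plausible interpretation of the two consolidation mechanisms named in the title.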
Details
- Language :
- English
- ISSN :
- 21693536
- Volume :
- 8
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.2f5c13fff0c4cd3bfa529909f6fa1c9
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2020.3025558