
An overview of incremental feature extraction methods based on linear subspaces

Authors :
Aura Hernández-Sabaté
Francesc J. Ferri
Katerine Diaz-Chito
Source :
Knowledge-Based Systems. 145:219-235
Publication Year :
2018
Publisher :
Elsevier BV, 2018.

Abstract

With the massive explosion of machine learning in our day-to-day life, incremental and adaptive learning has become a major topic, crucial for keeping classification models and their corresponding feature extraction processes up to date. This paper presents a categorized overview of incremental feature extraction based on linear subspace methods, which aim to incorporate new information into the already acquired knowledge without accessing previous data. Specifically, this paper focuses on linear dimensionality reduction methods with orthogonal matrix constraints based on a global loss function, given the extensive use of their batch counterparts compared with other linear alternatives. Thus, we cover the approaches derived from Principal Component Analysis, Linear Discriminant Analysis and Discriminative Common Vector methods. For each basic method, its incremental approaches are differentiated according to the subspace model and matrix decomposition involved in the updating process. Beyond this categorization, several updating strategies are distinguished according to the amount of data used per update and to whether a static or dynamic number of classes is considered. Moreover, the specific role of the size/dimension ratio in each method is examined. Finally, computational complexity, experimental setup and the accuracy rates according to published results are compiled and analyzed, and an empirical evaluation is performed to compare the best approach of each kind.
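To make the core idea concrete: incremental subspace methods maintain compact sufficient statistics (e.g., a running mean and scatter matrix) and fold each new batch of samples into them, so the principal subspace can be refreshed without revisiting earlier data. The sketch below is not from the paper; it is a minimal illustration of this mean/scatter merging scheme for incremental PCA, with hypothetical function names, using the standard pairwise update for combining scatter matrices.

```python
import numpy as np

def chunk_stats(X):
    """Summarize a data chunk as (count, mean, scatter matrix)."""
    n = X.shape[0]
    mu = X.mean(axis=0)
    D = X - mu
    return n, mu, D.T @ D  # scatter = sum of outer products of deviations

def merge_stats(n1, mu1, S1, n2, mu2, S2):
    """Merge two (count, mean, scatter) summaries without revisiting data.

    Uses the pairwise update: S = S1 + S2 + (n1*n2/n) * d d^T, d = mu2 - mu1.
    """
    n = n1 + n2
    delta = mu2 - mu1
    mu = mu1 + (n2 / n) * delta
    S = S1 + S2 + (n1 * n2 / n) * np.outer(delta, delta)
    return n, mu, S

def top_k_subspace(S, n, k):
    """Leading-k eigenvectors (orthonormal columns) of the covariance S / n."""
    vals, vecs = np.linalg.eigh(S / n)
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:k]]

# Synthetic example: process 500 samples in chunks of 100.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 5))

n, mu, S = chunk_stats(X[:100])
for start in range(100, 500, 100):
    n, mu, S = merge_stats(n, mu, S, *chunk_stats(X[start:start + 100]))

W_inc = top_k_subspace(S, n, 2)           # subspace from incremental statistics
nb, mub, Sb = chunk_stats(X)
W_batch = top_k_subspace(Sb, nb, 2)       # subspace from a single batch pass
```

Because the merged scatter matrix equals the batch scatter matrix exactly (up to floating-point error), `W_inc` and `W_batch` span the same subspace, which is the property the incremental PCA approaches surveyed here exploit; the more elaborate methods covered in the paper update a matrix decomposition directly rather than recomputing the eigendecomposition from scratch.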

Details

ISSN :
09507051
Volume :
145
Database :
OpenAIRE
Journal :
Knowledge-Based Systems
Accession number :
edsair.doi...........ac97c760e5b7c36f717e61eaa5c4dd81