151. Fast orthogonal locality-preserving projections for unsupervised feature selection.
- Author
- Zhu, Jianyong; Chen, Jingwei; Xu, Bin; Yang, Hui; Nie, Feiping
- Subjects
- *
ORTHOGRAPHIC projection , *REGULARIZATION parameter , *LINEAR operators , *SPARSE matrices , *GRAPH theory - Abstract
Graph-based sparsity learning is one of the most successful approaches to unsupervised feature selection and has been widely adopted in many real-world applications. However, traditional graph-based unsupervised feature selection methods have several drawbacks: (1) they are time-consuming and cannot handle large-scale problems; (2) the regularization parameter of the sparsity regularization term is difficult to tune; and (3) they cannot find explicit solutions owing to the limitation of the sparsity formulation, that is, feature selection with the ℓ2,1-norm constrained problem. Thus, this paper proposes OLPPFS, a method that preserves the local geometric structure within the feature subspace by imposing an ℓ2,0-norm constraint. First, the linear mapping capability of the proposed model is enhanced using locality-preserving projections (LPPs), which preserve the local and global geometric manifold structure of the data while improving the ability to reconstruct it. Second, the graph-embedding learning method accelerates the construction of a sparse affinity graph and describes the intrinsic structure of the dataset well. More importantly, we propose a method for solving the projection matrix under the ℓ2,0-norm constraint, which can accurately select an explicit group of discriminative feature subsets. This method yields a more accurate sparse projection matrix than the ℓ2,1-norm. We also present FOLPPFS, an effective anchor-based strategy that further accelerates our model with two flexible options. Extensive experiments on eight datasets demonstrate that the proposed method is superior to the other methods and preserves the local geometric structure of the dataset better with less time consumption. [ABSTRACT FROM AUTHOR]
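The key property the abstract attributes to the ℓ2,0-norm constraint can be stated concretely: a projection matrix W (features × projection dimensions) satisfies ‖W‖2,0 ≤ k when at most k of its rows are nonzero, and those nonzero rows directly index the selected features. The sketch below is a minimal illustration of this row-sparsity idea only, not the authors' OLPPFS optimization; the function name `l20_project` and the top-k-row-norm rule are assumptions introduced here for illustration.

```python
import numpy as np

def l20_project(W, k):
    """Illustrative l2,0 projection: keep the k rows of W with the
    largest l2 norms and zero out the rest, so exactly k features
    (rows) remain active. Not the authors' algorithm."""
    row_norms = np.linalg.norm(W, axis=1)      # one l2 norm per feature row
    keep = np.sort(np.argsort(row_norms)[-k:]) # indices of the top-k rows
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]
    return W_sparse, keep

rng = np.random.default_rng(0)
W = rng.standard_normal((6, 2))                # 6 features, 2 projection dims
W_sparse, selected = l20_project(W, k=3)
print(selected)                                # indices of the 3 kept features
print(int((np.linalg.norm(W_sparse, axis=1) > 0).sum()))  # prints 3
```

Unlike an ℓ2,1 penalty, which only shrinks row norms toward zero and requires thresholding afterwards, a projection of this kind makes the selected feature subset explicit, which is the distinction the abstract draws between the two norms.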
- Published
- 2023