Unsupervised Feature Selection via Nonnegative Orthogonal Constrained Regularized Minimization
- Publication Year :
- 2024
Abstract
- Unsupervised feature selection has drawn wide attention in the era of big data, since it is a primary technique for dimensionality reduction. However, many existing unsupervised feature selection models and solution methods were designed with applications in mind and lack theoretical support, e.g., convergence analysis. In this paper, we first establish a novel unsupervised feature selection model based on regularized minimization with nonnegative orthogonal constraints, which has the advantages of embedding feature selection into nonnegative spectral clustering and preventing overfitting. An effective inexact augmented Lagrangian multiplier method is proposed to solve our model, which adopts the proximal alternating minimization method to solve the subproblem at each iteration. We show that the sequence generated by our method globally converges to a Karush-Kuhn-Tucker point of our model. Extensive numerical experiments on popular datasets demonstrate the stability and robustness of our method. Moreover, comparisons of algorithm performance show that our method outperforms some existing state-of-the-art methods.
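- To make the formulation concrete, the following is a hedged sketch of the kind of objective the abstract describes; the data matrix $X$, feature-selection matrix $W$, cluster-indicator matrix $F$, graph Laplacian $L$, and weights $\alpha,\beta$ are illustrative assumptions rather than the paper's actual notation or model:

  \[
  \min_{W,\,F}\;\; \|X^{\top}W - F\|_F^2 \;+\; \alpha\,\|W\|_{2,1} \;+\; \beta\,\operatorname{Tr}\!\left(F^{\top} L F\right)
  \qquad \text{s.t.}\quad F \ge 0,\;\; F^{\top}F = I.
  \]

  Under this reading, the constraints $F \ge 0$ and $F^{\top}F = I$ are the nonnegative orthogonal constraints that embed the problem into nonnegative spectral clustering, while the row-sparse $\ell_{2,1}$ regularizer on $W$ selects features and mitigates overfitting. An inexact augmented Lagrangian method would then relax the orthogonality constraint with multipliers and a penalty term, and approximately minimize each subproblem over $W$ and $F$ by proximal alternating minimization.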
- Subjects :
- Mathematics - Optimization and Control
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2403.16966
- Document Type :
- Working Paper