1. Recursive Dimension Reduction for Semisupervised Learning.
- Authors
Ye, Qiaolin; Yin, Tongming; Gao, Shangbing; Jing, Jiajia; Zhang, Yu; and Sun, Cuiping
- Subjects
*RECURSIVE functions, *DIMENSION reduction (Statistics), *LEARNING, *ITERATIVE methods (Mathematics), *SUBSPACES (Mathematics)
- Abstract
Semisupervised Dimension Reduction (SDR) using the Trace Ratio criterion (TR-FSDA) is an effective iterative SDR algorithm. It introduces a flexible regularization term ‖F − XᵀW‖² to relax the hard linear constraint of SDA that the low-dimensional representation F must lie in the linear subspace spanned by the data matrix X. We observe, however, that TR-FSDA may produce meaningless features during its iterations and is not always guaranteed to converge. In this paper, we propose a novel SDR method, referred to as Recursive Dimension Reduction for Semisupervised learning (RDS). Instead of solving the non-trivial trace ratio (TR) problem with the iterative algorithm of TR-FSDA, we optimize the objective function of TR-FSDA with a newly developed recursive procedure: in each iteration, a single projection vector and a one-dimensional data representation are obtained by solving a standard Rayleigh Quotient problem. Our algorithm sidesteps the convergence issue, since it solves the objective directly and requires no iterative strategy to find each projection vector. Experiments on four face databases, one object database, one shape image database, and one handwritten digit database demonstrate the effectiveness of RDS. [ABSTRACT FROM AUTHOR]
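The per-iteration step described in the abstract, extracting one projection vector by solving a standard Rayleigh Quotient problem, can be sketched as a symmetric-definite generalized eigenvalue problem. This is a minimal illustration only: the scatter matrices `A` and `B` below are toy stand-ins, not the actual matrices constructed by TR-FSDA or RDS.

```python
import numpy as np
from scipy.linalg import eigh

def top_rayleigh_direction(A, B):
    """Return the vector w maximizing the Rayleigh quotient
    (w^T A w) / (w^T B w), i.e. the generalized eigenvector of
    A w = lambda B w with the largest eigenvalue."""
    # eigh solves the symmetric-definite generalized problem and
    # returns eigenvalues in ascending order, so the last column
    # of vecs is the maximizer.
    vals, vecs = eigh(A, B)
    return vecs[:, -1]

# Toy example with illustrative (assumed) scatter matrices.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
A = np.cov(X.T) + np.eye(5)   # stand-in "numerator" scatter
B = np.eye(5)                 # stand-in "denominator" scatter
w = top_rayleigh_direction(A, B)
```

In a recursive scheme of the kind the abstract describes, each such vector would be computed in turn (with the problem deflated or updated between steps) rather than all directions being found by one joint iterative optimization.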
- Published
- 2016