Learning for two-dimensional principal component analysis
- Author
Liang-Hwa Chen, Po-Lun Chang, and Chun-Hong Huang
- Subjects
Artificial neural network, Covariance matrix, Computer science, Pattern recognition, Visual object recognition, Facial recognition system, Principal component analysis, Artificial intelligence, Eigenvalues and eigenvectors
- Abstract
Two-dimensional principal component analysis (2D-PCA), which operates on 2D image matrices rather than the 1D vectors of standard PCA, was first successfully applied to face recognition, where it achieved higher accuracy than standard PCA, and was later applied to other problems such as facial expression recognition and object recognition. Unlike PCA, however, 2D-PCA still lacks a neural network learning algorithm. In this paper, we propose such an algorithm for 2D-PCA. It requires no evaluation of the image covariance matrix: by repeatedly presenting training image samples to a single-layer neural network, the multiple eigenvectors required for 2D-PCA are learned as the weight vectors of generalized linear neurons. The algorithm also profits from the parallel architecture of the neural network. Simulation experiments on the YaleB face database show that the proposed learning algorithm performs as expected.
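As a rough sketch of the ideas in the abstract, the batch form of 2D-PCA (top eigenvectors of the image covariance matrix) can be written in a few lines, alongside a covariance-free online variant that learns the same eigenvectors by repeatedly presenting samples. The online update shown is Sanger's generalized Hebbian rule applied to image rows; it illustrates the single-layer-network idea only and is an assumption, not the authors' actual learning rule.

```python
import numpy as np

def two_d_pca(images, d):
    """Batch 2D-PCA: top-d eigenvectors of the image covariance matrix.

    images: array of shape (N, m, n); returns W of shape (n, d).
    """
    X = np.asarray(images, dtype=float)
    centered = X - X.mean(axis=0)          # subtract the mean image
    # G = (1/N) * sum_i (X_i - mean)^T (X_i - mean), shape (n, n)
    G = np.einsum('ikj,ikl->jl', centered, centered) / X.shape[0]
    evals, evecs = np.linalg.eigh(G)       # eigenvalues in ascending order
    return evecs[:, ::-1][:, :d]           # columns = top-d eigenvectors

def gha_2dpca(images, d, lr=0.005, epochs=200, seed=0):
    """Covariance-free sketch: Sanger's rule on image rows.

    Each row of a centered image is one n-dim training sample, so the
    d weight vectors drift toward the top-d eigenvectors of G without
    G ever being formed. (Illustrative only, not the paper's rule.)
    """
    X = np.asarray(images, dtype=float)
    X = X - X.mean(axis=0)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, X.shape[2])) * 0.1
    for _ in range(epochs):
        for img in X:
            for x in img:                  # present one row at a time
                y = W @ x                  # outputs of the d neurons
                W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return (W / np.linalg.norm(W, axis=1, keepdims=True)).T  # (n, d)
```

Projecting each image as `Y = X @ W` then yields an (m, d) feature matrix per image, which 2D-PCA-based face recognition compares with a matrix distance.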
- Published
- 2010