
Class mean vector component and discriminant analysis

Authors :
Alexandros Iosifidis
Source :
Iosifidis, A. 2020, 'Class mean vector component and discriminant analysis', Pattern Recognition Letters, vol. 140, pp. 207-213. https://doi.org/10.1016/j.patrec.2020.10.014

Abstract

The kernel matrix used in kernel methods encodes all the information required for solving complex nonlinear problems defined on data representations in the input space using simple, but implicitly defined, solutions. Spectral analysis of the kernel matrix defines an explicit nonlinear mapping of the input data representations to a subspace of the kernel space, in which linear methods can be applied directly. However, the choice of kernel subspace is crucial for the performance of the subsequent processing steps. In this paper, we propose a component analysis method for kernel-based dimensionality reduction that optimally preserves the pairwise distances between the class means in the feature space. We provide an extensive analysis of the connection between the proposed criterion and those used in kernel principal component analysis and kernel discriminant analysis, leading to a discriminant analysis version of the proposed method. Our analysis also provides further insight into the properties of the feature spaces obtained by applying these methods.

8 pages, 2 figures, 2 tables
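The pipeline the abstract describes (computing a kernel matrix, mapping the data into a kernel subspace via spectral analysis, and judging the subspace by how well it preserves the pairwise distances between class means) can be illustrated with a small numerical sketch. The code below is not the authors' algorithm; it is a minimal, assumed illustration using an RBF kernel, a plain kernel-PCA-style eigendecomposition, and hypothetical helper names (rbf_kernel, kernel_subspace, class_mean_distances) to show how the class-mean geometry changes with the subspace dimensionality.

```python
# Illustrative sketch only: kernel subspace mapping and class-mean distances.
# Function names and data are assumptions, not from the paper.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix between the rows of X and the rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def center_kernel(K):
    """Center a square kernel matrix in the (implicit) feature space."""
    n = K.shape[0]
    one = np.ones((n, n)) / n
    return K - one @ K - K @ one + one @ K @ one

def kernel_subspace(K, dim):
    """Kernel-PCA-style mapping: keep the `dim` leading eigenpairs of K."""
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:dim]
    w, V = w[idx], V[:, idx]
    # Explicit representation of the training data in the kernel subspace.
    return V * np.sqrt(np.maximum(w, 0.0))

def class_mean_distances(Z, y):
    """Pairwise Euclidean distances between class means in the subspace."""
    classes = np.unique(y)
    means = np.stack([Z[y == c].mean(axis=0) for c in classes])
    diff = means[:, None, :] - means[None, :, :]
    return np.sqrt((diff**2).sum(axis=-1))

# Toy data: three Gaussian blobs in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(30, 2)) for c in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 30)

K = center_kernel(rbf_kernel(X, X, gamma=2.0))
for d in (2, 5, 20):
    Z = kernel_subspace(K, d)
    print(f"dim={d}\n{class_mean_distances(Z, y).round(3)}")
```

Comparing the printed distance matrices across dimensionalities illustrates the issue the proposed criterion targets: a low-dimensional kernel subspace selected by eigenvalue magnitude alone need not preserve the class-mean distances of the full kernel space.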

Details

Language :
English
ISSN :
0167-8655
Volume :
140
Database :
OpenAIRE
Journal :
Pattern Recognition Letters
Accession number :
edsair.doi.dedup.....b5c60fed93a2d07dc5598f2766681326
Full Text :
https://doi.org/10.1016/j.patrec.2020.10.014