Cost-Constrained Feature Optimization in Kernel Machine Classifiers
- Source :
- IEEE Signal Processing Letters. 22:2469-2473
- Publication Year :
- 2015
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2015.
Abstract
- Feature selection is often necessary when implementing classifiers in practice. Most approaches to feature selection are motivated by the curse of dimensionality, but few seek to mitigate the overall computational cost of feature extraction. In this work, we propose a model-based approach that addresses both objectives. The model is built around a sparse kernel machine with feature scaling parameters governed by a beta-Bernoulli prior, whose hyperparameters are set according to each feature's computational cost. Experiments were carried out on publicly available data sets, and the proposed Cost-Constrained Feature Optimization (CCFO) was compared to related methods in terms of accuracy and reduction in computational cost.
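- Illustration: the abstract does not give implementation details, so the sketch below is only a rough, hypothetical rendering of the idea it describes: binary feature-selection indicators drawn from a beta-Bernoulli prior, with the prior's hyperparameters tied to per-feature extraction cost and the indicators used to scale features inside a kernel. The cost-to-hyperparameter mapping, the RBF kernel form, and all names and values are assumptions, not the paper's CCFO method.

```python
# Minimal sketch (NOT the authors' CCFO implementation): cost-aware
# beta-Bernoulli feature indicators scaling an RBF-style kernel.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-feature extraction costs.
costs = np.array([0.1, 0.5, 2.0, 5.0])

# Assumed mapping from cost to Beta hyperparameters: more expensive
# features get a lower prior probability of being selected.
a = np.ones_like(costs)   # Beta shape parameter alpha
b = 1.0 + costs           # Beta shape parameter beta grows with cost

# Expected prior inclusion probability per feature: E[pi_d] = a / (a + b).
pi = a / (a + b)

# Sample binary indicators z_d ~ Bernoulli(pi_d) and use them as
# feature scaling parameters in the kernel.
z = rng.binomial(1, pi).astype(float)

def scaled_rbf(x, y, scales, gamma=1.0):
    """RBF kernel on features weighted by the sampled indicators."""
    d = scales * (x - y)
    return np.exp(-gamma * np.dot(d, d))

x1 = rng.normal(size=costs.size)
x2 = rng.normal(size=costs.size)
print("prior inclusion probs:", np.round(pi, 3))
print("sampled indicators:   ", z)
print("kernel value:         ", scaled_rbf(x1, x2, z))
```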
- Subjects :
- Graph kernel
Computer science
Feature vector
Feature extraction
Feature selection
Linear classifier
Feature scaling
Machine learning
k-nearest neighbors algorithm
Kernel (linear algebra)
Feature (machine learning)
Electrical and Electronic Engineering
Applied Mathematics
Dimensionality reduction
Kanade–Lucas–Tomasi feature tracker
Kernel method
Kernel embedding of distributions
Feature (computer vision)
Signal Processing
Radial basis function kernel
Artificial intelligence
Data mining
Feature learning
Curse of dimensionality
Details
- ISSN :
- 1558-2361 and 1070-9908
- Volume :
- 22
- Database :
- OpenAIRE
- Journal :
- IEEE Signal Processing Letters
- Accession number :
- edsair.doi...........410681ffffad8ac85cb912263f7f84d9