
Combining Lazy Learning, Racing and Subsampling for Effective Feature Selection.

Authors :
Bontempi, Gianluca
Birattari, Mauro
Meyer, Patrick E.
Editors :
Ribeiro, Bernardete
Albrecht, Rudolf F.
Dobnikar, Andrej
Pearson, David W.
Steele, Nigel C.
Source :
Adaptive & Natural Computing Algorithms; 2005, p393-396, 4p
Publication Year :
2005

Abstract

This paper presents a wrapper method for feature selection that combines Lazy Learning, racing and subsampling techniques. Lazy Learning (LL) is a local learning technique that, once a query is received, extracts a prediction by locally interpolating the neighboring examples of the query that are considered relevant according to a distance measure. Local learning techniques are often criticized for their limitations in dealing with problems with a high number of features and large samples. Similarly, wrapper methods are considered prohibitively expensive for a large number of features, due to the high cost of the evaluation step. The paper aims to show that a wrapper feature selection method based on LL can take advantage of two effective strategies: racing and subsampling. While the idea of racing was already proposed by Maron and Moore, this paper goes a step further by (i) proposing a multiple testing technique for less conservative racing and (ii) combining racing with subsampling techniques. [ABSTRACT FROM AUTHOR]
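
The abstract's ingredients lend themselves to a compact illustration. Below is a minimal, self-contained sketch (Python/NumPy, not the authors' code) of a lazy k-nearest-neighbour learner used inside a wrapper that races candidate feature subsets on random subsamples. The elimination rule, based on a standard-error margin, is a simplistic stand-in for the multiple-testing procedure proposed in the paper, and all function names and parameters (lazy_predict, subset_error, race_feature_subsets, k, margin) are illustrative assumptions.

```python
# Sketch: lazy (k-NN) learning + racing + subsampling for wrapper feature selection.
# Assumptions, not the authors' algorithm: local constant (averaging) prediction,
# leave-one-out squared error on each subsample, and a standard-error pruning rule
# in place of the paper's multiple-testing scheme.
import numpy as np


def lazy_predict(X_train, y_train, x_query, k=5):
    """Lazy Learning step: predict y for x_query by averaging the targets of
    its k nearest training examples (Euclidean distance)."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    nn = np.argsort(d)[:k]
    return y_train[nn].mean()


def subset_error(X, y, features, rng, n_sub=100, k=5):
    """Leave-one-out squared error of the lazy learner on one random subsample,
    using only the given feature subset."""
    idx = rng.choice(len(X), size=min(n_sub, len(X)), replace=False)
    Xs, ys = X[np.ix_(idx, features)], y[idx]
    errs = []
    for i in range(len(Xs)):
        mask = np.arange(len(Xs)) != i
        errs.append((lazy_predict(Xs[mask], ys[mask], Xs[i], k) - ys[i]) ** 2)
    return float(np.mean(errs))


def race_feature_subsets(X, y, candidates, n_rounds=10, margin=2.0, seed=0):
    """Race candidate feature subsets: after each subsample, drop subsets whose
    running mean error is worse than the best by more than `margin` combined
    standard errors (a crude proxy for a statistical test)."""
    rng = np.random.default_rng(seed)
    scores = {tuple(c): [] for c in candidates}
    alive = set(scores)
    for r in range(1, n_rounds + 1):
        for c in list(alive):
            scores[c].append(subset_error(X, y, list(c), rng))
        if r < 2:
            continue  # need at least two subsamples before pruning
        means = {c: np.mean(scores[c]) for c in alive}
        sems = {c: np.std(scores[c], ddof=1) / np.sqrt(r) + 1e-12 for c in alive}
        best = min(means, key=means.get)
        alive = {c for c in alive
                 if means[c] - means[best] <= margin * (sems[c] + sems[best])}
    return min(alive, key=lambda c: np.mean(scores[c]))


# Toy usage: only the first two of five features carry signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)
cands = [(0,), (0, 1), (0, 1, 2), (2, 3, 4)]
print(race_feature_subsets(X, y, cands))
```

In this toy setup only the first two features carry signal, so the race should typically settle on the subset (0, 1); the point of the racing step is that clearly inferior subsets are discarded early, before all subsamples have been evaluated.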

Details

Language :
English
ISBNs :
9783642049200
Database :
Complementary Index
Journal :
Adaptive & Natural Computing Algorithms
Publication Type :
Book
Accession number :
26196342
Full Text :
https://doi.org/10.1007/3-211-27389-1_95