
Kernel Discrimination and Explicative Features: an Operative Approach

Authors :
Colubi, A
Fokianos, K
González-Rodríguez, J
Kontoghiorghes, EJ
Liberati, C
Camillo, F
Saporta, G
Publication Year :
2012

Abstract

Kernel-based methods such as SVMs and LS-SVMs have been used successfully to solve a wide range of supervised classification and pattern-recognition problems in machine learning. Unfortunately, they depend heavily on the choice of the kernel function and on the tuning of its parameters, and their solutions suffer from a complete lack of interpretability in terms of the input variables. This is not a trivial issue, especially when the learning task concerns a critical business asset, such as credit scoring, where the derived classification rule must comply with international regulations. The following strategy is proposed for problems with categorical predictors: replace the predictors with components obtained from Multiple Correspondence Analysis (MCA), select the best kernel among several candidates (linear, RBF, Laplace, Cauchy, etc.), and approximate the resulting classifier with a linear model. The loss of performance due to this approximation is balanced by better interpretability for the end user, who can understand and rank the influence of each category of the input variables on the prediction. The strategy has been applied to real credit-risk data on small enterprises. The Cauchy kernel was found to be the best and leads to a score markedly more efficient than classical ones, even after approximation.
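The three-step strategy outlined in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it uses scikit-learn, approximates MCA by one-hot encoding followed by truncated SVD, stands in an SVC with a custom Cauchy kernel for the LS-SVM of the paper, and fits a linear surrogate to the kernel decision scores; all variable names and the synthetic data are hypothetical.

```python
# Hedged sketch of the abstract's strategy on synthetic categorical data.
# Assumptions: one-hot + TruncatedSVD stands in for MCA, SVC stands in for LS-SVM.
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import SVC
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic categorical predictors (simplified stand-in for credit-scoring attributes).
X_cat = rng.choice(["low", "mid", "high"], size=(500, 5))
y = (X_cat[:, 0] == "high").astype(int) ^ (X_cat[:, 1] == "low").astype(int)

# Step 1: replace categorical predictors by numerical components
# (one-hot encoding + truncated SVD as a rough proxy for MCA).
Z = OneHotEncoder().fit_transform(X_cat).toarray()
components = TruncatedSVD(n_components=6, random_state=0).fit_transform(Z)

# Step 2: kernel classifier with a Cauchy kernel k(x, y) = 1 / (1 + ||x - y||^2 / s^2),
# one of the candidate kernels mentioned in the abstract.
def cauchy_kernel(A, B, s=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return 1.0 / (1.0 + d2 / s ** 2)

clf = SVC(kernel=cauchy_kernel).fit(components, y)

# Step 3: approximate the kernel decision function by a linear model in the
# components, trading some accuracy for coefficients that can be mapped back
# to the original categories and ranked by influence.
scores = clf.decision_function(components)
surrogate = LinearRegression().fit(components, scores)
print("component weights of the linear surrogate:", np.round(surrogate.coef_, 3))
```

In practice the surrogate's component weights would be propagated back through the MCA loadings to rank the influence of each category of each input variable, as the abstract describes for the credit-scoring application.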

Details

Database :
OAIster
Notes :
CD-ROM, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1311386800
Document Type :
Electronic Resource