
Study on the efficiency of methods for selecting a splitting attribute for constructing decision trees.

Authors :
Mitrofanov, Sergei
Semenkin, Eugene
Source :
AIP Conference Proceedings. 2023, Vol. 2700 Issue 1, p1-8. 8p.
Publication Year :
2023

Abstract

One of the steps in building a decision tree is selecting a splitting attribute at each node, and the quality of the resulting classification depends on this choice. Classical decision tree learning algorithms, such as ID3 and CART, use an exhaustive search over the entire attribute space, which is a very time-consuming process: the objective function is evaluated for all objects over all attributes. Earlier studies demonstrated the efficiency of hybridizing an attribute selection method with differential evolution in decision tree learning, which can significantly speed up the learning process without losing classification quality. However, no research has yet addressed the choice of the attribute selection method itself. The authors of the paper compare nine of the most popular methods for splitting attribute selection. The methods are of different complexity: some use knowledge only about the attributes, while others also use knowledge about the class labels. The comparison was carried out on a set of classification problems, with decision tree learning time and classification accuracy chosen as performance criteria. Each attribute selection method is evaluated by its average performance over all classification problems. [ABSTRACT FROM AUTHOR]
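To make the exhaustive-search step concrete, below is a minimal sketch (not the authors' code) of how a splitting attribute is typically chosen: the objective function is evaluated for every candidate attribute over all objects, and the attribute with the best score is kept. It assumes ID3-style information gain as the criterion, categorical attributes, and toy data; none of these specifics come from the paper itself.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(X, y, attribute):
    """Entropy reduction from splitting the data on one categorical attribute."""
    parent = entropy(y)
    # Group class labels by the value of the chosen attribute.
    groups = {}
    for row, label in zip(X, y):
        groups.setdefault(row[attribute], []).append(label)
    weighted_child = sum(len(g) / len(y) * entropy(g) for g in groups.values())
    return parent - weighted_child

def best_split_attribute(X, y, attributes):
    """Exhaustive search: score every attribute and return the best one."""
    return max(attributes, key=lambda a: information_gain(X, y, a))

# Toy usage with hypothetical data: two categorical attributes, binary class.
X = [{"outlook": "sunny", "windy": "no"},
     {"outlook": "sunny", "windy": "yes"},
     {"outlook": "rain",  "windy": "no"},
     {"outlook": "rain",  "windy": "yes"}]
y = ["play", "stay", "play", "stay"]
print(best_split_attribute(X, y, ["outlook", "windy"]))  # -> "windy"
```

The cost of this exhaustive loop grows with both the number of attributes and the number of objects, which is the expense the paper's differential-evolution hybridization aims to reduce.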

Details

Language :
English
ISSN :
0094-243X
Volume :
2700
Issue :
1
Database :
Academic Search Index
Journal :
AIP Conference Proceedings
Publication Type :
Conference
Accession number :
162321677
Full Text :
https://doi.org/10.1063/5.0127003