
A local opposition-learning golden-sine grey wolf optimization algorithm for feature selection in data classification.

Authors :
Li, Zhang
Source :
Applied Soft Computing; Jul 2023, Vol. 142
Publication Year :
2023

Abstract

Classification is an important research topic in machine learning and data mining. Feature selection removes irrelevant and redundant features and improves classification accuracy. The traditional Grey Wolf Optimizer (GWO) suffers from low convergence efficiency and easily falls into local extremes during feature selection, leading to ineffective removal of irrelevant and redundant features. This paper proposes a binary local opposition-learning golden-sine grey wolf optimization algorithm (OGGWO). First, OGGWO uses a local opposition-learning mapping to initialize the positions of individual grey wolves, enriching population diversity and improving convergence speed. Second, it hybridizes the golden sine algorithm with the grey wolf optimizer, using the golden-ratio coefficient to control the direction and step length of the α wolf; this strengthens the autonomous search ability of individual wolves and keeps the algorithm from falling into local optima. Finally, the updated grey wolf positions are converted to binary with a preset threshold to reduce the size of the feature subset and improve classification performance. To verify the effectiveness of OGGWO, 18 international standard datasets were selected, and OGGWO was compared with improved GWO variants and popular metaheuristic algorithms in fitness-value tests and classification-accuracy simulation experiments. The results show that (1) OGGWO achieves good convergence and high search accuracy on all 18 test datasets, and (2) compared with the traditional GWO, its improvement strategies effectively raise classification accuracy and reduce the number of selected features. These experiments verify the superiority and robustness of OGGWO in feature selection.
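The opposition-learning initialization step described in the abstract can be sketched as follows. This is a minimal illustration of generic opposition-based learning (generate opposite points `lower + upper − x` and keep the fittest half); the paper's "local" variant may differ, and the function names and objective here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def opposition_init(pop_size, dim, lower, upper, fitness, seed=None):
    """Opposition-based population initialization (generic sketch).

    Generates a random population plus its opposite points
    (x_opp = lower + upper - x) and keeps the fittest half,
    which tends to spread candidates across the search space
    and speed up early convergence.
    """
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    opp = lower + upper - pop                      # opposite solutions
    merged = np.vstack([pop, opp])                 # 2 * pop_size candidates
    scores = np.apply_along_axis(fitness, 1, merged)
    best = np.argsort(scores)[:pop_size]           # keep best half (minimization)
    return merged[best]

# Example: initialize 10 wolves in [-5, 5]^4 for a sphere objective
wolves = opposition_init(10, 4, -5.0, 5.0, lambda x: np.sum(x**2), seed=0)
print(wolves.shape)  # (10, 4)
```

In a feature-selection setting, `fitness` would instead score a wolf by the classification error of the feature subset its position encodes.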
Highlights:
• A local opposition-learning method is proposed for population initialization.
• Golden sine combined with grey wolf optimization improves global exploration and local exploitation.
• The proposed method is evaluated on 18 benchmark datasets.
• The proposed method outperforms other feature selection algorithms.
[ABSTRACT FROM AUTHOR]
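The final binarization step (thresholding the updated continuous positions to pick a feature subset) might look like the sketch below. The sigmoid transfer function and the 0.5 threshold are common conventions in binary metaheuristics, assumed here for illustration rather than taken from the paper.

```python
import numpy as np

def binarize_position(position, threshold=0.5):
    """Map a continuous wolf position to a 0/1 feature mask.

    A sigmoid transfer function squashes each coordinate into (0, 1);
    coordinates above the preset threshold select the feature.
    The threshold value is an assumed default, not from the paper.
    """
    probs = 1.0 / (1.0 + np.exp(-np.asarray(position, dtype=float)))
    return (probs > threshold).astype(int)

mask = binarize_position([2.0, -1.5, 0.0, 3.2])
print(mask)  # [1 0 0 1]
```

Features whose mask entry is 1 form the candidate subset passed to the classifier; shrinking this subset while holding accuracy is the objective the fitness function trades off.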

Details

Language :
English
ISSN :
1568-4946
Volume :
142
Database :
Supplemental Index
Journal :
Applied Soft Computing
Publication Type :
Academic Journal
Accession number :
163892466
Full Text :
https://doi.org/10.1016/j.asoc.2023.110319