Feature gene selection based on fuzzy neighborhood joint entropy.
- Author
- Wang, Yan; Sun, Minjie; Long, Linbo; Liu, Jinhui; Ren, Yifan
- Subjects
- FEATURE selection; ENTROPY; NEIGHBORHOODS; GENE expression profiling; ROUGH sets; FUZZY algorithms
- Abstract
This paper addresses feature selection for gene expression profiles, where methods based on fuzzy neighborhood rough sets play an important role. Extracting key features from gene expression profiles has two potential drawbacks: (1) it may retain many redundant features, reducing classification accuracy, and (2) it may discard useful information. To address these problems, this paper proposes a fuzzy neighborhood joint entropy model for feature selection; the model uses the nonnegativity of fuzzy neighborhood joint entropy to evaluate the importance of each candidate feature gene. Building on this model, the paper proposes a new feature gene selection algorithm: the fuzzy neighborhood joint entropy (FNJE) algorithm. First, fuzzy neighborhood granules and the fuzzy decision are combined with the joint-entropy uncertainty measure to construct the fuzzy neighborhood joint entropy model. Second, the importance degree of a feature is introduced as the criterion for evaluating candidate feature genes. Third, the algorithm selects features according to this importance degree, reducing redundancy among the selected features and improving classification accuracy. A series of experiments on UCI and gene datasets shows that the proposed algorithm selects fewer feature genes and achieves higher classification accuracy. Compared with four other algorithms, the proposed algorithm improves accuracy by 0.4%-10.42% and 1.16%-15.18% and reaches maximum accuracies of 90.52% and 87.02% with the linear SVM and KNN classifiers, respectively. [ABSTRACT FROM AUTHOR]
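To make the selection loop described in the abstract more concrete, the Python sketch below illustrates greedy feature-gene selection driven by a fuzzy neighborhood entropy score. It is a minimal sketch under stated assumptions, not the paper's exact FNJE formulation: features are assumed min-max normalized to [0, 1], per-feature fuzzy similarity is taken as 1 - |x_i - x_j| cut off at a neighborhood radius delta, and the importance degree of a candidate feature is approximated by the drop in fuzzy neighborhood conditional entropy. The names fuzzy_neighborhood, fn_entropy, conditional_entropy, fnje_select, delta, and eps are illustrative only.

```python
import numpy as np

def fuzzy_neighborhood(X, features, delta=0.2):
    # Membership M[i, j]: degree to which sample j belongs to the fuzzy
    # neighborhood of sample i under the feature subset (min over features).
    # Assumes X is min-max normalized to [0, 1].
    n = X.shape[0]
    M = np.ones((n, n))
    for f in features:
        sim = 1.0 - np.abs(X[:, f][:, None] - X[:, f][None, :])
        sim[sim < 1.0 - delta] = 0.0      # zero membership outside radius delta
        M = np.minimum(M, sim)            # intersection across features
    return M

def fn_entropy(M):
    # Fuzzy neighborhood entropy computed from a membership matrix.
    n = M.shape[0]
    card = np.maximum(M.sum(axis=1), 1e-12)   # fuzzy cardinality per sample
    return -np.mean(np.log2(card / n))

def conditional_entropy(X, y, features, delta=0.2):
    # H(D | B) = H(B, D) - H(B), with the joint term built by intersecting
    # each fuzzy neighborhood with the sample's own decision class.
    M = fuzzy_neighborhood(X, features, delta)
    same_class = (y[:, None] == y[None, :]).astype(float)
    return fn_entropy(np.minimum(M, same_class)) - fn_entropy(M)

def fnje_select(X, y, delta=0.2, eps=1e-4):
    # Greedy forward selection: repeatedly add the candidate feature whose
    # importance (drop in conditional entropy) is largest and still positive.
    selected, remaining = [], list(range(X.shape[1]))
    h = conditional_entropy(X, y, selected, delta)
    while remaining:
        gains = {f: h - conditional_entropy(X, y, selected + [f], delta)
                 for f in remaining}
        best = max(gains, key=gains.get)
        if gains[best] <= eps:            # no meaningful importance gain left
            break
        selected.append(best)
        remaining.remove(best)
        h -= gains[best]
    return selected
```

In use, the selected indices would feed a downstream classifier, e.g. training a linear SVM or KNN on X[:, fnje_select(X, y)], mirroring the evaluation setup described in the abstract.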
- Published
- 2024