
Progressive Semisupervised Learning of Multiple Classifiers

Authors :
Hau-San Wong
Yide Wang
Zhiwen Yu
Jun Zhang
Jane You
Guoqiang Han
Ye Lu
Affiliations :
Southern University of Science and Technology, Shenzhen (SUSTech)
Department of Computing
City University of Hong Kong
Institut d'Électronique et des Technologies du numérique (IETR), Université de Nantes / Université de Rennes 1 / INSA Rennes / CentraleSupélec / CNRS

Funding :
61472145, NSFC
7004674, City University of Hong Kong
G-YM05, Hong Kong Polytechnic University
S2013050014677, Guangdong Natural Science Funds for Distinguished Young Scholars
2016B090918042, Science and Technology Planning Project of Guangdong Province, China
CityU 11300715, Research Grants Council of the Hong Kong Special Administrative Region, China
152202/14E, Hong Kong General Research Fund
Source :
IEEE Transactions on Cybernetics, 2018, 48 (2), pp. 689-702. ⟨10.1109/TCYB.2017.2651114⟩
Publication Year :
2017

Abstract

Semisupervised learning methods are often adopted to handle datasets with a very small number of labeled samples. However, conventional semisupervised ensemble learning approaches have two limitations: 1) most of them cannot obtain satisfactory results on high-dimensional datasets with limited labels and 2) they usually do not consider how to use an optimization process to enlarge the training set. In this paper, we propose the progressive semisupervised ensemble learning approach (PSEMISEL) to address the above limitations and handle datasets with a very small number of labeled samples. When compared with traditional semisupervised ensemble learning approaches, PSEMISEL is characterized by two properties: 1) it adopts the random subspace technique to investigate the structure of the dataset in the subspaces and 2) a progressive training set generation process and a self-evolutionary sample selection process are proposed to enlarge the training set. We also use a set of nonparametric tests to compare different semisupervised ensemble learning methods over multiple datasets. The experimental results on 18 real-world datasets from the University of California, Irvine machine learning repository show that PSEMISEL works well on most of the real-world datasets and outperforms other state-of-the-art approaches on 10 out of 18 datasets.
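The abstract describes PSEMISEL only at a high level. The sketch below is not the authors' algorithm; it is a minimal illustration, under stated assumptions, of the two ingredients the abstract names: training base classifiers in random feature subspaces and progressively enlarging the labeled set with confident ensemble pseudo-labels. The dataset (scikit-learn digits), the logistic-regression base learner, the subspace size, the confidence threshold, and the number of rounds are all hypothetical choices for illustration.

# A minimal, illustrative sketch of a random-subspace self-training ensemble.
# NOT the paper's PSEMISEL algorithm: it only hints at (1) exploring the data in
# random feature subspaces and (2) progressively enlarging the labeled training
# set with confident pseudo-labels. All hyperparameters are assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)

# Simulate a semisupervised setting: keep labels for only 5% of the samples.
X_lab, X_unlab, y_lab, _ = train_test_split(
    X, y, train_size=0.05, stratify=y, random_state=0)

n_members, subspace_dim, n_rounds, conf_thresh = 10, 32, 5, 0.95
subspaces = [rng.choice(X.shape[1], size=subspace_dim, replace=False)
             for _ in range(n_members)]

for _ in range(n_rounds):
    # Train one base learner per random subspace on the current labeled pool.
    members = []
    for feats in subspaces:
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X_lab[:, feats], y_lab)
        members.append(clf)

    if len(X_unlab) == 0:
        break

    # Ensemble prediction on the unlabeled pool: average class probabilities.
    probs = np.mean([clf.predict_proba(X_unlab[:, feats])
                     for clf, feats in zip(members, subspaces)], axis=0)
    pseudo = probs.argmax(axis=1)
    confident = probs.max(axis=1) >= conf_thresh

    if not confident.any():
        break

    # Progressively enlarge the training set with confident pseudo-labeled samples.
    X_lab = np.vstack([X_lab, X_unlab[confident]])
    y_lab = np.concatenate([y_lab, pseudo[confident]])
    X_unlab = X_unlab[~confident]

The paper's actual contributions, per the abstract, also include an optimization-based training set generation process and a self-evolutionary sample selection process, neither of which is captured by this simple confidence-threshold loop.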

Details

ISSN :
2168-2275 (Online) and 2168-2267 (Print)
Volume :
48
Issue :
2
Database :
OpenAIRE
Journal :
IEEE Transactions on Cybernetics
Accession number :
edsair.doi.dedup.....3f73092be1e3092723dca31f93a43e35
Full Text :
https://doi.org/10.1109/TCYB.2017.2651114