
Extreme ensemble of extreme learning machines.

Authors :
Mansoori, Eghbal G.
Sara, Massar
Source :
Statistical Analysis & Data Mining. Apr2021, Vol. 14 Issue 2, p116-128. 13p.
Publication Year :
2021

Abstract

Extreme learning machine (ELM) has attracted attention in pattern classification problems owing to its low computational cost and high generalization ability. To overcome its drawbacks, caused by the randomness of input weights and biases, the ensemble of ELMs was proposed. The diversity of the ELMs forming the ensemble has been studied broadly in the literature, via different activation functions and/or different numbers of hidden neurons. However, less attention has been paid to the aggregation mechanism in ensembles of ELMs. To speed up this aggregation process, we propose an ensemble framework for ELMs, called extreme ensemble of ELMs (EEoELM) because of its extreme speed in the ensemble process. In this framework, the input weights of each ELM are randomly pre‐assigned as usual. The ELMs use the same or distinct activation functions to increase the diversity of classifiers and thus the generalization of the ensemble. The output weights of each ELM are set using the Moore–Penrose inverse method. However, the aggregation mechanism in EEoELM is novel. Instead of applying majority/weighted voting to the prediction results of the ELMs, their output neurons are combined in a new decision/ensemble layer, whose output determines the ensemble output. As the weights of this decision layer are also computed using the Moore–Penrose inverse method, the ensemble is extremely fast. Experimental results on synthetic and real‐world datasets indicate the acceptable classification performance of EEoELM with much less computational effort. [ABSTRACT FROM AUTHOR]
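The abstract's pipeline (random input weights, pseudoinverse output weights per ELM, then a pseudoinverse-fitted decision layer over the concatenated member outputs) can be sketched in NumPy. This is a minimal illustration assuming one-hot class targets and argmax prediction; the function names (`train_elm`, `train_eeoelm`, etc.), member count, and hidden-layer sizes are hypothetical choices, not the authors' exact configuration.

```python
import numpy as np

def train_elm(X, Y, n_hidden, rng, activation=np.tanh):
    """One ELM: random input weights/biases, output weights via pseudoinverse."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = activation(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                     # Moore-Penrose solve for outputs
    return (W, b, beta, activation)

def elm_outputs(model, X):
    """Raw output-neuron values of one trained ELM."""
    W, b, beta, activation = model
    return activation(X @ W + b) @ beta

def train_eeoelm(X, Y, n_members=5, n_hidden=20, seed=0):
    """Train members with varied activations, then fit the decision layer
    over their concatenated output neurons, again via pseudoinverse."""
    rng = np.random.default_rng(seed)
    acts = [np.tanh, lambda z: 1.0 / (1.0 + np.exp(-z))]  # diversity via activations
    models = [train_elm(X, Y, n_hidden, rng, acts[i % len(acts)])
              for i in range(n_members)]
    O = np.hstack([elm_outputs(m, X) for m in models])  # combined output neurons
    V = np.linalg.pinv(O) @ Y                           # decision-layer weights
    return models, V

def predict(models, V, X):
    """Ensemble prediction: decision layer over concatenated member outputs."""
    O = np.hstack([elm_outputs(m, X) for m in models])
    return np.argmax(O @ V, axis=1)
```

Because every trainable weight matrix (each `beta` and the decision layer `V`) is obtained from a single pseudoinverse solve rather than iterative optimization, both member training and aggregation are non-iterative, which is the source of the speed claim in the abstract.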

Subjects

Subjects :
*MACHINE learning
*GENERALIZATION

Details

Language :
English
ISSN :
19321864
Volume :
14
Issue :
2
Database :
Academic Search Index
Journal :
Statistical Analysis & Data Mining
Publication Type :
Academic Journal
Accession number :
149246457
Full Text :
https://doi.org/10.1002/sam.11493