Low-Discrepancy Points for Deterministic Assignment of Hidden Weights in Extreme Learning Machines.

Authors :
Cervellera, Cristiano
Macciò, Danilo
Source :
IEEE Transactions on Neural Networks & Learning Systems; Apr. 2016, Vol. 27, Issue 4, p891-896, 6p
Publication Year :
2016

Abstract

The traditional extreme learning machine (ELM) approach is based on a random assignment of the hidden weight values, while the linear coefficients of the output layer are determined analytically. This brief presents an analysis based on geometric properties of the sampling points used to assign the weight values, investigating the replacement of random generation of such values with low-discrepancy sequences (LDSs). Such sequences are a family of sampling methods commonly employed for numerical integration, yielding a more efficient covering of multidimensional sets than random sequences, without the need for any computationally intensive procedure. In particular, we prove that the universal approximation property of the ELM is preserved when LDSs are employed, and we show how a more efficient covering positively affects the convergence. Furthermore, since LDSs are generated deterministically, the results are not probabilistic in nature. Simulation results confirm, in practice, the good theoretical properties obtained by combining the ELM with LDSs. [ABSTRACT FROM AUTHOR]
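The idea described in the abstract can be illustrated with a minimal sketch: hidden weights and biases are drawn from a Sobol low-discrepancy sequence instead of a random generator, and the output-layer coefficients are computed analytically by least squares. This is only an assumption-laden illustration (the specific LDS, the mapping of points to the weight domain, and the activation function used in the paper may differ); the helper names `elm_fit` and `elm_predict` are hypothetical.

```python
import numpy as np
from scipy.stats import qmc  # Sobol low-discrepancy sequence generator


def elm_fit(X, y, n_hidden=64):
    """Train a single-hidden-layer ELM whose hidden weights come from a
    Sobol sequence (a sketch; the paper's exact construction may differ)."""
    n_samples, d = X.shape
    # One Sobol point per hidden unit, in [0, 1]^(d+1): d input weights
    # plus one bias per unit. Unscrambled Sobol is deterministic.
    sobol = qmc.Sobol(d=d + 1, scramble=False)
    m = int(np.ceil(np.log2(max(n_hidden, 2))))
    pts = sobol.random_base2(m=m)[:n_hidden]
    pts = 2.0 * pts - 1.0            # map [0, 1] -> [-1, 1] (assumed domain)
    W, b = pts[:, :d], pts[:, d]
    H = np.tanh(X @ W.T + b)         # hidden-layer output matrix
    # Output-layer coefficients determined analytically via least squares.
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta


def elm_predict(X, W, b, beta):
    """Evaluate the trained ELM on new inputs."""
    return np.tanh(X @ W.T + b) @ beta
```

Because the Sobol points are generated deterministically, repeated calls to `elm_fit` with the same data produce identical models, unlike the standard randomly initialized ELM.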

Details

Language :
English
ISSN :
2162-237X
Volume :
27
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
113872486
Full Text :
https://doi.org/10.1109/TNNLS.2015.2424999