Enhancing performance of the backpropagation algorithm via sparse response regularization.

Authors :
Zhang, Jiangshe
Ji, Nannan
Liu, Junmin
Pan, Jiyuan
Meng, Deyu
Source :
Neurocomputing. Apr2015, Vol. 153, p20-40. 21p.
Publication Year :
2015

Abstract

The backpropagation (BP) algorithm is the most commonly used training strategy for feed-forward artificial neural networks (FFANNs). BP-trained FFANNs, however, often suffer from slow convergence, high energy consumption, and poor generalization. In this paper, motivated by the sparsity of human neurons' responses, we introduce a new sparse-response BP (SRBP) algorithm that improves the capability of an FFANN by enforcing sparsity on its hidden units through a supplementary L1 penalty term. The FFANN model learned by our algorithm thus mirrors key mechanisms of the human nervous system, namely sparse representation and architectural depth. Experiments on several datasets demonstrate that SRBP yields good performance in convergence rate, energy saving, and generalization capability. [ABSTRACT FROM AUTHOR]
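The idea in the abstract — standard backpropagation plus an L1 penalty on hidden-unit activations — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the toy XOR data, network size, learning rate, and penalty weight are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data (assumed for illustration): learn XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, lr, lam = 8, 0.5, 1e-3   # lam weights the L1 sparsity term
W1 = rng.normal(0.0, 1.0, (2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
b2 = np.zeros(1)

for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden responses
    out = sigmoid(h @ W2 + b2)

    # Objective: squared error + lam * ||h||_1.
    # Backward pass: the L1 term contributes its subgradient sign(h)
    # to the gradient w.r.t. the hidden activations, pushing hidden
    # responses toward zero (sparse response).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T + lam * np.sign(h)) * h * (1 - h)

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

# Final forward pass with the trained weights
h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
loss = np.mean((out - y) ** 2)
```

Because the sigmoid hidden activations are nonnegative, the L1 subgradient sign(h) is effectively a constant downward pressure on every active hidden unit, which is what drives most units toward near-zero response.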

Details

Language :
English
ISSN :
09252312
Volume :
153
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
100850161
Full Text :
https://doi.org/10.1016/j.neucom.2014.11.055