Error Weighting in Artificial Neural Networks Learning Interpreted as a Metaplasticity Model.
- Source :
- Bio-inspired Modeling of Cognitive Tasks; 2007, p244-252, 9p
- Publication Year :
- 2007
Abstract
- Many Artificial Neural Network design algorithms or learning methods involve the minimization of an error objective function. During learning, weight values are updated following a strategy that tends to minimize the final mean error in the network's performance. Weight values are classically seen as a representation of the synaptic weights in biological neurons, and their ability to change value can be interpreted as artificial plasticity inspired by this biological property of neurons. Accordingly, metaplasticity is interpreted in this paper as the ability to change the efficiency of artificial plasticity, giving more relevance to weight updates triggered by less frequent activations and less relevance to those triggered by frequent ones. Modeling this interpretation in the training phase, the hypothesis of improved training is tested for the Multilayer Perceptron trained with Backpropagation. The results show much more efficient training while maintaining the Artificial Neural Network's performance. [ABSTRACT FROM AUTHOR]
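The update strategy described in the abstract lends itself to a brief illustration. Below is a minimal sketch in NumPy of metaplasticity-weighted backpropagation for a small Multilayer Perceptron: each pattern's weight update is scaled by the inverse of a Gaussian estimate of that pattern's probability, so rare activations count more and frequent ones less. The function name `metaplasticity_factor`, the constants `A` and `B`, and the XOR toy task are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch: metaplasticity-weighted online backpropagation on XOR.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def metaplasticity_factor(x, A=1.0, B=0.25):
    # Assumed weighting: inverse of an (unnormalized) Gaussian estimate of the
    # input pattern's probability, so less frequent patterns get larger updates.
    # A and B are illustrative constants, not values from the paper.
    n = x.size
    p_est = np.exp(-B * np.dot(x, x)) / np.sqrt((2.0 * np.pi) ** n)
    return A / p_est

# XOR toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer, trained with per-pattern (online) backpropagation
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
eta = 0.05  # base learning rate, before metaplasticity scaling

for epoch in range(5000):
    for x, y in zip(X, Y):
        # Forward pass
        h = sigmoid(x @ W1 + b1)
        o = sigmoid(h @ W2 + b2)
        # Backward pass (squared-error gradient through sigmoid units)
        delta_o = (o - y) * o * (1 - o)
        delta_h = (delta_o @ W2.T) * h * (1 - h)
        # Metaplasticity: scale this pattern's update by its assumed rarity
        m = metaplasticity_factor(x)
        W2 -= eta * m * np.outer(h, delta_o); b2 -= eta * m * delta_o
        W1 -= eta * m * np.outer(x, delta_h); b1 -= eta * m * delta_h

# Trained outputs should approach [0, 1, 1, 0]
print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))
```

In this sketch the scaling only modulates the effective learning rate per pattern; the network architecture and the backpropagation gradients themselves are standard.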
Details
- Language :
- English
- ISBNs :
- 9783540730521
- Database :
- Supplemental Index
- Journal :
- Bio-inspired Modeling of Cognitive Tasks
- Publication Type :
- Book
- Accession number :
- 33214118
- Full Text :
- https://doi.org/10.1007/978-3-540-73053-8_24