An extended back-propagation learning algorithm by using heterogeneous processing units
- Source:
- [Proceedings 1992] IJCNN International Joint Conference on Neural Networks
- Publication Year:
- 2003
- Publisher:
- IEEE, 2003.
Abstract
- Based on the idea of using heterogeneous processing units (PUs) in a network, a variation of the back-propagation (BP) learning algorithm is presented. Three parameters, adjustable in the same way as connection weights, are incorporated into each PU to increase its autonomy by enhancing its output function. The extended BP learning algorithm is then derived by updating these three parameters along with the connection weights. The extended BP is intended not only to improve learning speed but also to reduce the occurrence of local minima. The algorithm has been tested intensively on the XOR problem. With carefully chosen learning rates, the results show that the extended BP outperforms the standard BP in both learning speed and frequency of local minima.
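The abstract does not specify the three per-unit parameters or the paper's analytic update rules, but the core idea can be sketched as follows: give each processing unit a parameterized output function (here a hypothetical generalized sigmoid with amplitude `a`, slope `b`, and threshold `c`) and train those parameters alongside the connection weights. For brevity this sketch uses finite-difference gradients on the XOR task rather than the paper's derived back-propagation formulas; all function names and the network shape are illustrative assumptions.

```python
import numpy as np

def ext_sigmoid(net, a, b, c):
    """Generalized sigmoid: amplitude a, slope b, threshold c (assumed form)."""
    return a / (1.0 + np.exp(-(b * net - c)))

def unpack(theta):
    """Split a flat parameter vector into weights and per-unit (a, b, c)."""
    W1 = theta[0:4].reshape(2, 2)    # input -> 2 hidden units
    W2 = theta[4:6].reshape(2, 1)    # hidden -> 1 output unit
    p1 = theta[6:12].reshape(2, 3)   # (a, b, c) for each hidden unit
    p2 = theta[12:15].reshape(1, 3)  # (a, b, c) for the output unit
    return W1, W2, p1, p2

def forward(theta, X):
    W1, W2, p1, p2 = unpack(theta)
    h = ext_sigmoid(X @ W1, p1[:, 0], p1[:, 1], p1[:, 2])
    return ext_sigmoid(h @ W2, p2[:, 0], p2[:, 1], p2[:, 2])

def mse(theta, X, T):
    return float(np.mean((forward(theta, X) - T) ** 2))

def num_grad(theta, X, T, eps=1e-5):
    """Central-difference gradient over ALL parameters: weights and (a, b, c)."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (mse(tp, X, T) - mse(tm, X, T)) / (2 * eps)
    return g

# The XOR problem, the benchmark used in the paper's experiments.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
theta = rng.normal(0.0, 0.5, 15)
theta[[6, 9, 12]] = 1.0   # start amplitudes a at 1 (standard sigmoid range)
theta[[7, 10, 13]] = 1.0  # start slopes b at 1

initial_loss = mse(theta, X, T)
for _ in range(2000):                 # plain full-batch gradient descent
    theta -= 0.1 * num_grad(theta, X, T)
final_loss = mse(theta, X, T)
print(f"MSE: {initial_loss:.4f} -> {final_loss:.4f}")
```

Because `a`, `b`, and `c` are treated exactly like weights in the gradient step, each unit can reshape its own output function during training, which is the mechanism the paper credits for faster learning and fewer local minima on XOR.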
Details
- Database:
- OpenAIRE
- Journal:
- [Proceedings 1992] IJCNN International Joint Conference on Neural Networks
- Accession number:
- edsair.doi...........d86cc77c4f0232252951c298ab6c27e1
- Full Text:
- https://doi.org/10.1109/ijcnn.1992.227071