Natural gradient learning algorithms for RBF networks
- Author
Chi Zhang, Weili Guo, Weiling Li, Kanjian Zhang, Haikun Wei, and Junsheng Zhao
- Subjects
Cognitive Neuroscience, Gaussian, Activation function, Probability density function, Nonlinear system, Function approximation, Arts and Humanities (miscellaneous), Radial basis function, Fisher information, Gradient descent, Algorithm, Mathematics
- Abstract
Radial basis function (RBF) networks are among the most widely used models for function approximation and classification. The learning process of RBF networks exhibits several undesirable behaviors, such as slow learning speed and the existence of plateaus. The natural gradient learning method can overcome these disadvantages effectively: it accelerates the learning dynamics and avoids plateaus. In this letter, we assume that the probability density function (pdf) of the input and the activation function are Gaussian. First, we introduce natural gradient learning to RBF networks and give explicit forms of the Fisher information matrix and its inverse. Second, since the Fisher information matrix and its inverse are difficult to calculate when the number of hidden units and the dimension of the input are large, we introduce an adaptive method into the natural gradient learning algorithm. Finally, we give an explicit form of the adaptive natural gradient learning algorithm and compare it with the conventional gradient descent method. Simulations show that the proposed adaptive natural gradient method, which avoids plateaus effectively, performs well when RBF networks are used for nonlinear function approximation.
- Published
- 2014
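
The adaptive scheme the abstract describes can be summarized as a recursive estimate of the inverse Fisher information matrix combined with a preconditioned gradient step. Below is a minimal sketch in Python/NumPy, not the authors' exact formulation: it trains the output weights and centers of a Gaussian RBF network using the rank-one recursive inverse-Fisher update known from Amari, Park, and Fukumizu's adaptive natural gradient method. All names (`forward`, `grad_output`, `ETA`), the hyperparameter values, and the toy target function are illustrative assumptions.

```python
# Sketch: adaptive natural gradient learning for a Gaussian RBF network.
# Assumptions (not from the paper): parameter set = output weights + centers,
# fixed width SIGMA, squared-error loss, decaying adaptation rate eps_t.
import numpy as np

rng = np.random.default_rng(0)

H, D = 5, 2                      # hidden units, input dimension
SIGMA = 1.0                      # fixed Gaussian width (simplification)
ETA = 0.05                       # learning rate (illustrative)
c = rng.normal(size=(H, D))      # centers
w = rng.normal(size=H)           # output weights
P = H + H * D                    # total parameter count

def forward(x, w, c):
    """f(x) = sum_i w_i * exp(-||x - c_i||^2 / (2 * SIGMA^2))."""
    phi = np.exp(-np.sum((x - c) ** 2, axis=1) / (2 * SIGMA ** 2))
    return phi @ w, phi

def grad_output(x, w, c, phi):
    """Gradient of the network output w.r.t. all parameters (w, then c)."""
    g_w = phi                                          # df/dw_i = phi_i
    g_c = (w * phi)[:, None] * (x - c) / SIGMA ** 2    # df/dc_i
    return np.concatenate([g_w, g_c.ravel()])

def target(x):                   # toy nonlinear function to approximate
    return np.sin(x[0]) * np.cos(x[1])

# Recursive inverse-Fisher estimate:
#   G_inv <- (1 + eps) * G_inv - eps * (G_inv df)(G_inv df)^T
# followed by the natural gradient step theta <- theta - eta * G_inv * grad(loss).
G_inv = np.eye(P)
for t in range(5000):
    x = rng.uniform(-2.0, 2.0, size=D)
    y, phi = forward(x, w, c)
    df = grad_output(x, w, c, phi)

    eps = 1.0 / (t + 100.0)                            # decaying adaptation rate
    v = G_inv @ df
    G_inv = (1.0 + eps) * G_inv - eps * np.outer(v, v) # rank-one update

    err = y - target(x)
    step = ETA * err * (G_inv @ df)                    # natural gradient of 0.5*err^2
    w -= step[:H]
    c -= step[H:].reshape(H, D)
```

The design point of the adaptive variant is visible in the loop: the rank-one update keeps the per-step cost at O(P^2) and never forms or inverts the full Fisher matrix, which is what makes natural gradient learning practical when the number of hidden units or the input dimension is large.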