
A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices

Authors :
Liu, Yiguang
You, Zhisheng
Cao, Liping
Source :
Computers & Mathematics with Applications. Jan 2007, Vol. 53, Issue 1, p41-53. 13p.
Publication Year :
2007

Abstract

Because the efficient computation of eigenpairs of a matrix, especially a general real matrix, is significant in engineering, and because neural networks run asynchronously and can achieve high computational performance, this paper introduces a recurrent neural network (RNN) to extract an eigenpair. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose variable is a complex vector. Using the analytic expression of the solution to this system, the convergence properties of the RNN are analyzed in detail. With a general nonzero initial complex vector, the RNN obtains the largest imaginary part among all eigenvalues; by rearranging the connection matrix, the largest real part is obtained instead. An experiment with a 7×7 matrix indicates the validity of the method. Two matrices, of dimensions 50 and 100, are employed to test the efficiency of the approach as the dimension grows; the results imply that the number of iterations at which the network reaches its equilibrium state is not sensitive to dimensionality. The RNN can also be used to estimate the largest modulus of the eigenvalues. Compared with other neural networks designed for similar aims, this RNN is applicable to general real matrices. [Copyright Elsevier]
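The abstract does not give the network's exact dynamics, but the general idea it describes, a continuous-time system whose state converges along the dominant eigendirection, can be illustrated with a minimal sketch. The code below is an assumption-laden stand-in, not the paper's RNN: it Euler-integrates a normalized linear flow z' = Az (power-method-like dynamics), whose Rayleigh quotient converges, for a generic complex initial vector, to the eigenvalue with the largest real part. Since Re(-iλ) = Im(λ), running the same flow on -iA recovers the largest imaginary part of the eigenvalues of A. The function name and parameters are illustrative choices.

```python
import numpy as np

def dominant_real_part_eigenvalue(A, steps=20000, eta=1e-3, seed=0):
    """Euler-integrate the normalized flow z' = Az.

    For a generic complex initial vector, the Rayleigh quotient converges
    to the eigenvalue of A whose real part is largest (assuming it is
    unique). This is a sketch of the idea, not the paper's network.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    z /= np.linalg.norm(z)
    for _ in range(steps):
        z = z + eta * (A @ z)      # one Euler step of z' = Az
        z /= np.linalg.norm(z)     # keep the state on the unit sphere
    return np.vdot(z, A @ z)       # Rayleigh quotient z^H A z (||z|| = 1)

# A real 3x3 matrix with eigenvalues 2i, -2i, and 1:
A = np.array([[0., -2., 0.],
              [2.,  0., 0.],
              [0.,  0., 1.]])

largest_re = dominant_real_part_eigenvalue(A).real        # ~ 1.0
largest_im = dominant_real_part_eigenvalue(-1j * A).real  # ~ 2.0 = max Im
```

The trick in the last line mirrors the abstract's remark that a rearrangement of the connection matrix switches the network between the largest-imaginary-part and largest-real-part problems: multiplying A by -i rotates every eigenvalue so that its imaginary part becomes a real part.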

Details

Language :
English
ISSN :
0898-1221
Volume :
53
Issue :
1
Database :
Academic Search Index
Journal :
Computers & Mathematics with Applications
Publication Type :
Academic Journal
Accession number :
24710284
Full Text :
https://doi.org/10.1016/j.camwa.2006.09.004