Analysis of Hebbian Models with Lateral Weight Connections.

Authors :
Hutchison, David
Kanade, Takeo
Kittler, Josef
Kleinberg, Jon M.
Mattern, Friedemann
Mitchell, John C.
Naor, Moni
Nierstrasz, Oscar
Pandu Rangan, C.
Steffen, Bernhard
Sudan, Madhu
Terzopoulos, Demetri
Tygar, Doug
Vardi, Moshe Y.
Weikum, Gerhard
Sandoval, Francisco
Prieto, Alberto
Cabestany, Joan
Graña, Manuel
Zufiria, Pedro J.
Source :
Computational & Ambient Intelligence; 2007, p39-46, 8p
Publication Year :
2007

Abstract

In this paper, the behavior of some Hebbian artificial neural networks with lateral weights is analyzed. Hebbian neural networks are employed in communications and signal processing applications for implementing on-line Principal Component Analysis (PCA). Different improvements over the original Oja model have been developed in the last two decades. Among them, models with lateral weights have been designed to directly provide the eigenvectors of the correlation matrix [1,5,6,9]. The behavior of Hebbian models has traditionally been studied by resorting to an associated continuous-time formulation, under some questionable assumptions that are not guaranteed in real implementations. In this paper we employ the alternative deterministic discrete-time (DDT) formulation, which characterizes the average evolution of these networks and captures the influence of the time evolution of the learning gains [12]. The dynamic behavior of some of these Hebbian models is analytically characterized in this context, and several simulations complement this comparative study. [ABSTRACT FROM AUTHOR]
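For readers unfamiliar with the family of models the abstract refers to, the following is a minimal, hedged sketch (not the authors' formulation) of an online Hebbian PCA network with anti-Hebbian lateral connections in the style of the Rubner-Tavan/APEX models: feedforward weights follow an Oja-type Hebbian rule, while hierarchical lateral weights decorrelate the outputs so that each unit converges toward a distinct eigenvector of the data correlation matrix. Function name, parameters, and learning rates are illustrative assumptions only.

```python
import numpy as np

def hebbian_pca_lateral(X, n_components, lr=0.01, n_epochs=50, seed=0):
    """Illustrative online Hebbian PCA with lateral (anti-Hebbian) weights.

    X            : (n_samples, n_features) zero-mean data.
    n_components : number of principal directions to extract.
    Returns W, whose rows approximate the leading eigenvectors of the
    correlation matrix of X (sketch only; no convergence guarantees).
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, n_features))  # feedforward weights
    U = np.zeros((n_components, n_components))                  # lateral weights (strictly lower-triangular)

    for _ in range(n_epochs):
        for x in X:
            # Forward pass: unit i receives the input plus lateral input
            # from the preceding units (hierarchical lateral structure).
            y = np.zeros(n_components)
            for i in range(n_components):
                y[i] = W[i] @ x + U[i, :i] @ y[:i]
            # Oja-style Hebbian update of the feedforward weights.
            W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
            # Anti-Hebbian update of the lateral weights; they decorrelate
            # the outputs and tend to zero as the units separate.
            for i in range(n_components):
                U[i, :i] -= lr * y[i] * y[:i]
    return W

if __name__ == "__main__":
    # Toy check: compare against the eigenvectors of the sample correlation matrix.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.2])
    X -= X.mean(axis=0)
    W = hebbian_pca_lateral(X, n_components=2, lr=0.005, n_epochs=20)
    _, vecs = np.linalg.eigh(X.T @ X / len(X))
    print(np.abs(W @ vecs[:, ::-1][:, :2]))  # near-identity (up to sign) if converged
```

The DDT analysis discussed in the paper studies the averaged, deterministic counterpart of such stochastic update rules, including the effect of time-varying learning gains; the sketch above only illustrates the stochastic online updates themselves.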

Details

Language :
English
ISBNs :
9783540730064
Database :
Complementary Index
Journal :
Computational & Ambient Intelligence
Publication Type :
Book
Accession number :
33147682
Full Text :
https://doi.org/10.1007/978-3-540-73007-1_6