
General Potential Surfaces and Neural Networks.

Authors :
BROWN UNIV PROVIDENCE RI CENTER FOR NEURAL SCIENCE
Dembo, Amir
Zeitouni, Ofer
Source :
DTIC AND NTIS
Publication Year :
1987

Abstract

An investigation of Hopfield's neural-network model of associative memory led to a generalized potential system with far superior performance as an associative memory. In particular, the system has no spurious memories, and any set of desired points can be stored with unlimited capacity (in the continuous-time, real-space version of the model). The system has no limit cycles, and with a proper choice of the design parameters every basin of attraction can extend up to half the distance between stored points. A discrete-time version whose state space is the unit hypercube is also derived; it has superior properties compared to the corresponding Hopfield network. In particular, for a fixed desired size of the basins of attraction, the capacity of a system of N neurons grows exponentially with N and is asymptotically optimal in the information-theoretic sense. The computational complexity of this model is slightly larger than that of the Hopfield memory, but of the same order. The results are derived from an axiomatic approach that specifies the desired properties and shows that the model described above is the only one that achieves them.

Supersedes AD-A181 933.
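To make the continuous-time idea concrete, the following is a minimal Python sketch of an associative memory realized as gradient descent on a potential surface whose only wells sit at the stored points. The specific inverse-power potential V(x) = -sum_i ||x - x_i||^(-L), the exponent L, and the normalized-step descent are illustrative assumptions for this demo, not the report's exact construction.

import numpy as np

def grad_V(x, memories, L=4):
    """Gradient of the illustrative potential
    V(x) = -sum_i ||x - x_i||^(-L)  (an assumed form, for the demo)."""
    g = np.zeros_like(x)
    for m in memories:
        d = x - m
        r = np.linalg.norm(d)
        # d/dx [ -r^(-L) ] = L * (x - m) * r^(-L-2)
        g += L * d * r ** (-(L + 2))
    return g

def recall(x0, memories, step=5e-3, iters=400):
    """Normalized gradient descent: a probe inside a basin of
    attraction drifts to the stored point at that basin's bottom."""
    x = x0.copy()
    for _ in range(iters):
        g = grad_V(x, memories)
        x -= step * g / np.linalg.norm(g)
    return x

rng = np.random.default_rng(0)
memories = [rng.standard_normal(8) for _ in range(3)]
probe = memories[1] + 0.1 * rng.standard_normal(8)   # noisy cue
print(np.linalg.norm(recall(probe, memories) - memories[1]))  # small (within ~step)

Starting from a probe near one of the stored points, the final state lands within roughly one step size of that point, echoing the abstract's claim that every stored point is an attractor with a controllable basin of attraction.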

Details

Database :
OAIster
Journal :
DTIC AND NTIS
Notes :
text/html, English
Publication Type :
Electronic Resource
Accession number :
edsoai.ocn831574193
Document Type :
Electronic Resource