501. General Potential Surfaces and Neural Networks.
- Author
- Dembo, Amir; Zeitouni, Ofer (BROWN UNIV PROVIDENCE RI CENTER FOR NEURAL SCIENCE)
- Abstract
Investigation of Hopfield's model of associative memory implementation by a neural network led to a generalized potential system with much superior performance as an associative memory. In particular, there are no spurious memories, and any set of desired points can be stored, with unlimited capacity (in the continuous-time, real-space version of the model). The system has no limit cycles, and by proper choice of the design parameters the basins of attraction can extend up to half the distance between the stored points. A discrete-time version, with the unit hypercube as its state space, is also derived, and it admits properties superior to those of the corresponding Hopfield network. In particular, the capacity of a system of N neurons with a fixed desired size of the basins of attraction grows exponentially with N and is asymptotically optimal in the information-theoretic sense. The computational complexity of this model is slightly larger than that of the Hopfield memory, but of the same order. The results are derived under an axiomatic approach that specifies the desired properties and shows that the above model is the only one achieving them. (An illustrative sketch of gradient dynamics on such a potential surface follows this record.) Supersedes AD-A181 933.
- Published
- 1987
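
The abstract does not give the explicit form of the potential, so the following is only a minimal sketch of the kind of dynamics it describes: gradient descent on a potential surface with a well at each stored point. The inverse-power potential E(x) = -sum_k ||x - m_k||^(-L), the exponent L, the step size, and all names below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def grad_E(x, memories, L):
    """Gradient of the assumed potential E(x) = -sum_k ||x - m_k||^(-L).

    Each stored point m_k sits at the bottom of a potential well; for
    large L the well of the nearest memory dominates the gradient.
    """
    g = np.zeros_like(x)
    for m in memories:
        d = x - m
        r = max(np.linalg.norm(d), 1e-12)   # guard against r = 0
        g += L * r ** (-(L + 2)) * d        # gradient of -r^(-L)
    return g

def recall(x0, memories, L=8, step=1e-3, iters=2000):
    """Follow the descent flow dx/dt = -grad E(x) with a fixed step length.

    The step is normalized so the raw gradient (which blows up near a
    well) cannot overshoot; x drifts to the nearest stored point.
    """
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = grad_E(x, memories, L)
        x -= step * g / max(np.linalg.norm(g), 1e-12)
    return x

rng = np.random.default_rng(0)
memories = rng.normal(size=(5, 10))               # 5 stored points in R^10
probe = memories[2] + 0.1 * rng.normal(size=10)   # corrupted copy of memory 2
print(np.linalg.norm(recall(probe, memories) - memories[2]))  # ~ step size
```

In this toy run the probe starts inside the basin of memory 2 and the flow returns it to that point, with no spurious attractor in between, which is the behavior the abstract claims for the continuous-time model. The discrete-time version on the unit hypercube mentioned in the abstract is not reproduced here.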