
Shot-noise-limited performance of optical neural networks.

Authors :
Hayat MM
Saleh BE
Gubner JA
Source :
IEEE transactions on neural networks [IEEE Trans Neural Netw] 1996; Vol. 7 (3), pp. 700-8.
Publication Year :
1996

Abstract

We consider the performance of neural networks in which weights and signals are modeled by shot-noise processes; optical neural networks and biological systems are examples of such networks. We develop a theory that facilitates the computation of the average probability of error in binary-input/binary-output multistage and recurrent networks. The probability of error is expressed in terms of two key parameters: the computing-noise parameter and the weight-recording-noise parameter. The former is the average number of particles per clock cycle per signal and represents noise due to the particle nature of the signal; the latter is the average number of particles per weight and represents noise in the weight-recording process. For a fixed computing-noise parameter, the probability of error decreases as the weight-recording-noise parameter increases and saturates at a level limited by the computing-noise parameter; a similar behavior is observed when the roles of the two parameters are interchanged. As both parameters increase, the probability of error decreases to zero exponentially fast, at a rate determined using large-deviations theory. We show that performance can be optimized by a selective choice of the nonlinearity threshold levels. For recurrent networks, the probability of error initially increases with the number of iterations and then saturates at a level determined by the stationary distribution of a Markov chain.
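To make the noise model concrete, the following is a minimal Monte Carlo sketch (not the authors' analytic calculation) of a single binary thresholding neuron whose signals and recorded weights are drawn as Poisson counts, with m standing in for the computing-noise parameter and r for the weight-recording-noise parameter. The function name simulate_error_rate, the midpoint threshold choice, and the single-neuron setup are illustrative assumptions, not the paper's construction.

import numpy as np

rng = np.random.default_rng(0)

def simulate_error_rate(m, r, n_inputs=4, n_trials=20000, theta=None):
    """Monte Carlo estimate of the probability of a binary output error for a
    single thresholding neuron whose signals and recorded weights are modeled
    as Poisson (shot-noise) counts.

    m : computing-noise parameter, mean particle count per '1' signal per cycle
    r : weight-recording-noise parameter, mean particle count per unit weight
    """
    # Ideal binary weights and the noiseless decision they imply.
    w_ideal = rng.integers(0, 2, size=n_inputs)          # weights in {0, 1}
    x = rng.integers(0, 2, size=(n_trials, n_inputs))    # binary inputs
    ideal_sum = x @ w_ideal
    t_ideal = max(w_ideal.sum(), 1) / 2.0                # midpoint threshold
    y_ideal = (ideal_sum > t_ideal).astype(int)

    # Shot-noise realizations: each recorded weight is a Poisson count with
    # mean r * w_ideal, and each '1' signal contributes a Poisson count with
    # mean m per clock cycle.
    w_noisy = rng.poisson(r * w_ideal, size=(n_trials, n_inputs))
    s_noisy = rng.poisson(m * x)

    # Detected particle count, with weights rescaled back to order one.
    detected = (s_noisy * w_noisy).sum(axis=1) / max(r, 1e-12)

    # Decision threshold in particle units; tuning it is the optimization knob
    # mentioned in the abstract (here a plain midpoint choice, an assumption).
    theta = m * t_ideal if theta is None else theta
    y_noisy = (detected > theta).astype(int)

    return np.mean(y_noisy != y_ideal)

# Sweep both noise parameters.
for m in (1, 5, 20, 100):
    print(m, [round(simulate_error_rate(m, r), 4) for r in (1, 5, 20, 100)])

In this toy model, the estimated error rate should fall as either parameter grows and then flatten at a floor set by the other parameter, mirroring the saturation behavior described in the abstract.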

Details

Language :
English
ISSN :
1045-9227
Volume :
7
Issue :
3
Database :
MEDLINE
Journal :
IEEE transactions on neural networks
Publication Type :
Academic Journal
Accession number :
18263466
Full Text :
https://doi.org/10.1109/72.501727