Transputers and neural networks: an analysis of implementation constraints and performance
- Source :
- IEEE Transactions on Neural Networks, 4(2)
- Publication Year :
- 1993
Abstract
- A performance analysis is presented that focuses on the achievable speedup of a neural network implementation and on the optimal size of a processor network (transputers, or multicomputers that communicate in a comparable manner). For fully and randomly connected neural networks, the topology of the processor network can have only a small, constant effect on the iteration time. With randomly connected neural networks, even severely limiting node fan-in does little to reduce the communication overhead. The class of modular neural networks is studied as a separate case and is shown to have better implementation characteristics. On the basis of implementation constraints, it is argued that randomly connected neural networks cannot be realistic models of the brain.
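As a rough illustration of the kind of analysis the abstract describes, the sketch below models the iteration time of a fully connected network of n neurons mapped onto p communicating processors as per-processor computation plus a communication term that does not shrink with p, and derives the resulting speedup. This is a minimal sketch under assumed costs, not the paper's model; the names iteration_time, speedup, t_calc, and t_comm are illustrative.

```python
# Minimal sketch (assumed cost model, not the paper's): iteration time for a
# fully connected network of n neurons spread over p processors.
def iteration_time(n, p, t_calc=1.0, t_comm=5.0):
    compute = (n * n / p) * t_calc   # each processor updates n/p neurons, each with n inputs
    communicate = n * t_comm         # every activation must still reach every processor
    return compute + communicate

def speedup(n, p, **kw):
    # Speedup relative to a single processor; saturates once communication dominates.
    return iteration_time(n, 1, **kw) / iteration_time(n, p, **kw)

if __name__ == "__main__":
    n = 1024
    for p in (1, 2, 4, 8, 16, 32, 64):
        print(f"p={p:3d}  speedup={speedup(n, p):6.2f}")
```

Because the communication term is independent of p in this toy model, the speedup levels off and there is an optimal processor-network size beyond which extra transputers add little, which is the qualitative behaviour the abstract refers to.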
- Subjects :
- Physical neural network
Quantitative Biology::Neurons and Cognition
Artificial neural network
Computer Networks and Communications
Time delay neural network
Computer science
Deep learning
Distributed computing
General Medicine
Network topology
Computer Science Applications
Probabilistic neural network
Recurrent neural network
Artificial Intelligence
Cellular neural network
Artificial intelligence
Types of artificial neural networks
Stochastic neural network
Software
Nervous system network models
Details
- ISSN :
- 1045-9227
- Volume :
- 4
- Issue :
- 2
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Neural Networks
- Accession number :
- edsair.doi.dedup.....e659a765287d276b0f966b3306a558f3