Bistable Sigmoid Networks
- Source: Advances in Computational Intelligence, IWANN (2)
- Publication Year: 2019
- Publisher: Springer International Publishing
Abstract
- It is commonly known that Hopfield networks suffer from spurious states and from low storage capacity. To eliminate the spurious states, Bistable Gradient Networks (BGN) introduce neurons with bistable behavior. The weights in BGN are calculated in analogy to those of Hopfield networks, i.e. with Hebbian learning. Unfortunately, those networks still suffer from small storage capacity, resulting in high reconstruction errors when used to reconstruct noisy patterns. This paper proposes a new type of neural network consisting of neurons with a sigmoid (hyperbolic tangent) transfer function and direct feedback. The feedback renders the neuron bistable. Furthermore, instead of Hebbian learning, which has drawbacks when applied to overlapping patterns, we use the first-order Contrastive Divergence (CD1) learning rule. We call these networks Bistable Sigmoid Networks (BSN). When recalling patterns from the MNIST database, the reconstruction error is zero even under high load, provided no noise is applied. For an increasing noise level or an increasing number of patterns the error rises only moderately.
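The abstract does not state the exact neuron dynamics, but the core mechanism it names, a hyperbolic tangent unit made bistable by a direct self-feedback connection, can be illustrated with a minimal sketch. All names and the specific update rule `x <- tanh(external + feedback * x)` are assumptions for illustration, not the paper's definitive formulation:

```python
import math

def neuron_state(feedback, external=0.0, x0=0.1, steps=200):
    """Iterate x <- tanh(external + feedback * x) until (approximately)
    converged, and return the final state.

    `feedback` is the direct self-connection strength; for feedback > 1
    the map has two stable fixed points, i.e. the unit is bistable.
    (Illustrative sketch only; the paper's exact dynamics may differ.)
    """
    x = x0
    for _ in range(steps):
        x = math.tanh(external + feedback * x)
    return x

# With feedback > 1 the converged state follows the sign of the initial
# condition: the same unit settles near +1 or -1 depending on its history.
up = neuron_state(feedback=2.0, x0=0.1)
down = neuron_state(feedback=2.0, x0=-0.1)

# With weak feedback (< 1) the only fixed point is 0, so the unit is
# monostable and forgets its initial state.
flat = neuron_state(feedback=0.5, x0=0.9)
```

This bistability is what lets each unit latch onto one of two states during recall, which is how such networks avoid the intermediate spurious states of classical Hopfield units.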
Details
- ISBN: 978-3-030-20517-1
- Database: OpenAIRE
- Journal: Advances in Computational Intelligence, IWANN (2)
- Accession number: edsair.doi...........b448b8a7ddb3ac6d38fcf792e35fe8cf