113 results for "Désiré Bollé"
Search Results
2. Self-control dynamics for sparsely coded networks with synaptic noise.
- Author
- Désiré Bollé and Rob Heylen
- Published
- 2004
- Full Text
- View/download PDF
3. Coupled Simulated Annealing.
- Author
- Samuel Xavier de Souza, Johan A. K. Suykens, Joos Vandewalle, and Désiré Bollé
- Published
- 2010
- Full Text
- View/download PDF
4. Adaptive Thresholds for Neural Networks with Synaptic Noise.
- Author
- Désiré Bollé and Rob Heylen
- Published
- 2007
- Full Text
- View/download PDF
5. Mutual information of sparsely coded associative memory with self-control and ternary neurons.
- Author
- Désiré Bollé, David Renato C. Dominguez-Carreta, and Shun-ichi Amari
- Published
- 2000
- Full Text
- View/download PDF
6. Spectra of sparse regular graphs with loops
- Author
- F. L. Metz, I. Neri, and Désiré Bollé
- Published
- 2011
7. Parallel dynamics of extremely diluted neural networks.
- Author
- Désiré Bollé, B. Vinck, and A. Zagrebnov
- Published
- 1993
8. Mixture states in Potts neural networks.
- Author
- Désiré Bollé and J. Huyghebaert
- Published
- 1993
9. Optimal Nonlinear Training in the Multi-Class Proximity Problem.
- Author
- Désiré Bollé, G. Jongen, and G. M. Shim
- Published
- 1996
10. Synchronous versus sequential updating in the three-state Ising neural network with variable dilution
- Author
- Désiré Bollé, Toni Verbeiren, and R. Erichsen
- Subjects
Statistics and Probability ,Statistical Mechanics (cond-mat.stat-mech) ,Artificial neural network ,Replica ,Generating function ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,State (functional analysis) ,Extension (predicate logic) ,Condensed Matter - Disordered Systems and Neural Networks ,Condensed Matter Physics ,Combinatorics ,Flow (mathematics) ,Ising model ,Statistical physics ,Condensed Matter - Statistical Mechanics ,Mathematics ,Variable (mathematics) - Abstract
The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of self-coupling is established. The dynamics is also studied, using the generating function technique, for both synchronous and sequential updating. Typical flow diagrams for the overlap order parameter are presented. The differences with the signal-to-noise approach are outlined.
- Published
- 2006
- Full Text
- View/download PDF
11. The Blume-Emery-Griffiths neural network with synchronous updating and variable dilution
- Author
- J. Busquets Blanco and Désiré Bollé
- Subjects
Artificial neural network ,Replica ,Complex system ,Mutual information ,Condensed Matter Physics ,Electronic, Optical and Magnetic Materials ,Dilution ,symbols.namesake ,Mean field theory ,symbols ,Optimal combination ,Statistical physics ,Hamiltonian (quantum mechanics) ,Mathematics - Abstract
The thermodynamic and retrieval properties of the Blume-Emery-Griffiths neural network with synchronous updating and variable dilution are studied using replica mean-field theory. Several forms of dilution are allowed by pruning the different types of couplings present in the Hamiltonian. The appearance and properties of two-cycles are discussed. Capacity-temperature phase diagrams are derived for several values of the pattern activity. The results are compared with those for sequential updating. The effect of self-coupling is studied. Furthermore, the optimal combination of dilution parameters giving the largest critical capacity is obtained.
- Published
- 2005
- Full Text
- View/download PDF
12. Two-cycles in spin-systems: sequential versus synchronous updating in multi-state Ising-type ferromagnets
- Author
- J. Busquets Blanco and Désiré Bollé
- Subjects
Physics ,Spin glass ,Spins ,Complex system ,Ising model ,Statistical physics ,Type (model theory) ,Condensed Matter Physics ,Condensed Matter::Disordered Systems and Neural Networks ,Symmetry (physics) ,Electronic, Optical and Magnetic Materials ,Phase diagram ,Spin-½ - Abstract
Hamiltonians for general multi-state spin-glass systems with Ising symmetry are derived for both sequential and synchronous updating of the spins. The possibly different behaviour caused by the way of updating is studied in detail for the (anti)-ferromagnetic version of the models, which can be solved analytically without any approximation, both thermodynamically via a free-energy calculation and dynamically using the generating functional approach. Phase diagrams are discussed and the appearance of two-cycles in the case of synchronous updating is examined. A comparative study is made for the Q-Ising and the Blume-Emery-Griffiths ferromagnets and some interesting physical differences are found. Numerical simulations confirm the results obtained.
- Published
- 2004
- Full Text
- View/download PDF
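The two-cycle behaviour discussed in entry 12 is easy to see numerically. The following sketch is only a cartoon of the simplest case (a plain Ising ferromagnet on a ring at zero temperature, not the multi-state models of the paper): synchronous updating of an alternating initial state produces a period-2 orbit, while sequential updating of the same state relaxes to a fixed point. The system size and tie-breaking rule are arbitrary choices.

```python
import numpy as np

def local_field(s, i):
    """Nearest-neighbour ferromagnetic field on a ring (J = +1)."""
    n = len(s)
    return s[(i - 1) % n] + s[(i + 1) % n]

def step_synchronous(s):
    """Update all spins at once; keep the old spin when the field vanishes."""
    h = np.roll(s, 1) + np.roll(s, -1)
    return np.where(h != 0, np.sign(h), s).astype(int)

def step_sequential(s, order):
    """Update spins one by one in the given order."""
    s = s.copy()
    for i in order:
        h = local_field(s, i)
        if h != 0:
            s[i] = np.sign(h)
    return s

n = 16
alternating = np.array([(-1) ** i for i in range(n)])      # +1, -1, +1, ...

# Synchronous updating: every spin sees two opposite neighbours and flips,
# so the configuration alternates with its complement -> a two-cycle.
s = alternating.copy()
history = [s.copy()]
for _ in range(6):
    s = step_synchronous(s)
    history.append(s.copy())
print("synchronous period-2:", np.array_equal(history[0], history[2]),
      "and no fixed point:", not np.array_equal(history[0], history[1]))

# Sequential updating from the same state reaches a fixed point instead.
s = alternating.copy()
for _ in range(50):
    s = step_sequential(s, range(n))
print("sequential fixed point:", np.array_equal(s, step_sequential(s, range(n))))
```

The mechanism is visible by hand: under synchronous updating every spin of the alternating state sees two opposite neighbours and flips, so the configuration and its complement alternate forever, whereas one-by-one updates let domains settle.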
13. A layered neural network with three-state neurons optimizing the mutual information
- Author
- W. K. Theumann, R. Erichsen, and Désiré Bollé
- Subjects
Statistics and Probability ,Theoretical computer science ,Statistical Mechanics (cond-mat.stat-mech) ,Artificial neural network ,Computer science ,Time evolution ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Mutual information ,Condensed Matter - Disordered Systems and Neural Networks ,Fixed point ,Condensed Matter Physics ,Topology ,Quantitative Biology ,Synaptic noise ,Flow (mathematics) ,FOS: Biological sciences ,Feedforward neural network ,Condensed Matter - Statistical Mechanics ,Quantitative Biology (q-bio) ,Network model - Abstract
The time evolution of an exactly solvable layered feedforward neural network with three-state neurons that optimizes the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. It is found that there is an improved performance in the form of both a larger critical capacity and information content compared with three-state Ising-type layered network models. Flow diagrams reveal that saddle-point solutions associated with fluctuation overlaps considerably slow down the flow of the network states towards the stable fixed points.
- Published
- 2004
- Full Text
- View/download PDF
14. A spherical Hopfield model
- Author
- Toni Verbeiren, Désiré Bollé, Th. M. Nieuwenhuizen, I. Pérez Castillo, and Quantum Condensed Matter Theory (ITFA, IoP, FNWI)
- Subjects
Physics ,Statistical Mechanics (cond-mat.stat-mech) ,Quantitative Biology::Neurons and Cognition ,Artificial neural network ,Closed set ,Replica ,FOS: Physical sciences ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Quantitative Biology ,Continuous variable ,symbols.namesake ,FOS: Biological sciences ,Quartic function ,symbols ,Statistical physics ,Hamiltonian (quantum mechanics) ,Langevin dynamics ,Quantitative Biology (q-bio) ,Condensed Matter - Statistical Mechanics ,Mathematical Physics - Abstract
We introduce a spherical Hopfield-type neural network involving neurons and patterns that are continuous variables. We study both the thermodynamics and dynamics of this model. In order to have a retrieval phase a quartic term is added to the Hamiltonian. The thermodynamics of the model is exactly solvable and the results are replica symmetric. A Langevin dynamics leads to a closed set of equations for the order parameters and effective correlation and response function typical for neural networks. The stationary limit corresponds to the thermodynamic results. Numerical calculations illustrate our findings.
- Published
- 2003
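For orientation, the energy function behind entry 14 has the following general shape. This is a hedged reconstruction from the abstract (Hebbian couplings between continuous spins on a sphere, plus a quartic term in the pattern overlaps); the precise normalisation, sign conventions and the coefficient u are assumptions here, not values taken from the paper:

\[
H \;=\; -\frac{N}{2}\sum_{\mu=1}^{p} m_\mu^{2} \;-\; \frac{u\,N}{4}\sum_{\mu=1}^{p} m_\mu^{4},
\qquad
m_\mu \;=\; \frac{1}{N}\sum_{i=1}^{N} \xi_i^{\mu}\,\sigma_i ,
\qquad
\sum_{i=1}^{N} \sigma_i^{2} \;=\; N .
\]

The quadratic part is the usual Hopfield energy written in terms of the overlaps m_\mu; according to the abstract it is the added quartic piece that restores a retrieval phase under the spherical constraint.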
15. On-line learning and generalization in coupled perceptrons
- Author
- Désiré Bollé and P. Kozłowski
- Subjects
Computer Science::Machine Learning ,Generalization ,Computer science ,business.industry ,Supervised learning ,FOS: Physical sciences ,Physics::Physics Education ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Perceptron ,Condensed Matter::Disordered Systems and Neural Networks ,Generalization error ,Term (time) ,Simple (abstract algebra) ,Learning curve ,Line (geometry) ,Artificial intelligence ,business ,Mathematical Physics - Abstract
We study supervised learning and generalisation in coupled perceptrons trained on-line using two learning scenarios. In the first scenario the teacher and the student are independent networks and both are represented by an Ashkin-Teller perceptron. In the second scenario the student and the teacher are simple perceptrons but are coupled by an Ashkin-Teller type four-neuron interaction term. Expressions for the generalisation error and the learning curves are derived for various learning algorithms. The analytic results find excellent confirmation in numerical simulations.
- Published
- 2002
- Full Text
- View/download PDF
16. From shrinking to percolation in an optimization model
- Author
- J. van Mourik, K. Y. Michael Wong, and Désiré Bollé
- Subjects
Discrete mathematics ,Signal processing ,Noise (signal processing) ,Noise reduction ,Mathematical analysis ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Space (mathematics) ,Signal ,Symmetry (physics) ,Constraint (information theory) ,Percolation ,Mathematical Physics ,Mathematics - Abstract
A model of noise reduction for signal processing and other optimization tasks is introduced. Each noise source puts a symmetric constraint on the space of the signal vector within a tolerance bound. When the number of noise sources increases, sequences of transitions take place, causing the solution space to vanish. We find that the transition from an extended solution space to a shrunk space is retarded because of the symmetry of the constraints, in contrast with the analogous problem of pattern storage. For low tolerance, the solution space vanishes by volume reduction, whereas for high tolerance, the vanishing becomes more and more like percolation.
- Published
- 2000
- Full Text
- View/download PDF
17. Self-Control in Sparsely Coded Networks
- Author
- D. R. C. Dominguez and Désiré Bollé
- Subjects
Statistical Mechanics (cond-mat.stat-mech) ,Artificial neural network ,Time delay neural network ,Computer science ,business.industry ,FOS: Physical sciences ,General Physics and Astronomy ,Pattern recognition ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Mutual information ,Function (mathematics) ,Condensed Matter - Disordered Systems and Neural Networks ,Content-addressable memory ,Quantitative biology ,Quantitative Biology ,Noise ,FOS: Biological sciences ,Artificial intelligence ,business ,Condensed Matter - Statistical Mechanics ,Quantitative Biology (q-bio) - Abstract
A complete self-control mechanism is proposed in the dynamics of neural networks through the introduction of a time-dependent threshold, determined as a function of both the noise and the pattern activity in the network. Especially for sparsely coded models this mechanism is shown to considerably improve the storage capacity, the basins of attraction and the mutual information content of the network.
- Published
- 1998
- Full Text
- View/download PDF
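Entry 17 (and the related entries 2 and 49) revolves around the same self-control idea: a threshold that tracks the residual noise so that sparsely coded retrieval works autonomously. The sketch below is a rough illustration under simplifying assumptions that are not taken from the papers: binary 0/1 neurons, covariance-rule couplings, and a threshold set to sqrt(2 ln(1/a)) times an on-line estimate of the noise width obtained from the currently silent neurons. All parameters are demo values.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, a = 2000, 40, 0.05                       # neurons, stored patterns, coding activity

xi = (rng.random((p, N)) < a).astype(float)    # sparse 0/1 patterns
J = (xi - a).T @ (xi - a) / (a * (1 - a) * N)  # covariance (Hebb-like) couplings
np.fill_diagonal(J, 0.0)

def retrieve(self_control, steps=15, theta_fixed=0.5, corruption=0.3):
    """Parallel 0/1 dynamics from a corrupted cue; the threshold adapts or stays fixed."""
    sigma = xi[0].copy()
    flip = rng.random(N) < corruption
    sigma[flip] = 1.0 - sigma[flip]
    for _ in range(steps):
        h = J @ sigma
        if self_control:
            # Self-control in spirit: estimate the residual-noise width from the fields
            # of the currently silent neurons and rescale by sqrt(2 ln(1/a)) (assumed form).
            theta = np.sqrt(2.0 * np.log(1.0 / a)) * np.std(h[sigma == 0])
        else:
            theta = theta_fixed
        sigma = (h > theta).astype(float)
    return ((xi[0] - a) @ sigma) / (a * (1 - a) * N)   # retrieval overlap

print("overlap, adaptive threshold:", round(retrieve(True), 3))
print("overlap, fixed threshold   :", round(retrieve(False), 3))
```

In this toy setting the adaptive threshold typically rescues a corrupted cue that a fixed threshold of comparable size wipes out; the exact numbers depend on the arbitrary choices above.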
18. [Untitled]
- Author
- G. Jongen, Désiré Bollé, and G. M. Shim
- Subjects
Set (abstract data type) ,Theoretical computer science ,Artificial neural network ,Distribution (number theory) ,Time evolution ,Probabilistic logic ,Statistical and Nonlinear Physics ,Ising model ,Statistical physics ,Content-addressable memory ,Local field ,Mathematical Physics ,Mathematics - Abstract
Using a probabilistic approach, the parallel dynamics of fully connected Q-Ising neural networks is studied for arbitrary Q. A novel recursive scheme is set up to determine the time evolution of the order parameters through the evolution of the distribution of the local field, taking into account all feedback correlations. In contrast to extremely diluted and layered network architectures, the local field is no longer normally distributed but contains a discrete part. As an illustrative example, an explicit analysis is carried out for the first four time steps. For the case of the Q = 2 and Q = 3 model the results are compared with extensive numerical simulations and excellent agreement is found. Finally, equilibrium fixed-point equations are derived and compared with the thermodynamic approach based upon the replica-symmetric mean-field approximation.
- Published
- 1998
- Full Text
- View/download PDF
19. Dynamics of temporal activity in multi-state neural networks
- Author
- K. Y. M. Wong, Gyoung Moo Shim, and Désiré Bollé
- Subjects
Multi state ,Artificial neural network ,Computer science ,business.industry ,Time delay neural network ,Pattern reconstruction ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Pattern recognition ,computer.software_genre ,Clipping (photography) ,Dynamics (music) ,Artificial intelligence ,Data mining ,business ,computer ,Mathematical Physics - Abstract
We consider the behaviour of multi-state neural networks averaged over an extended monitoring period of their dynamics. Pattern reconstruction by clipping the activities is proposed, leading to an improvement in retrieval precision.
- Published
- 1997
- Full Text
- View/download PDF
20. OPTIMAL NONLINEAR TRAINING IN THE MULTI-CLASS PROXIMITY PROBLEM
- Author
- G. M. Shim, G. Jongen, and Désiré Bollé
- Subjects
Theoretical computer science ,Computer Networks and Communications ,Generalization ,Gaussian ,Hyperbolic function ,Binary number ,General Medicine ,Perceptron ,Linear function ,Nonlinear system ,symbols.namesake ,Hebbian theory ,Nonlinear Dynamics ,symbols ,Learning ,Applied mathematics ,Neural Networks, Computer ,Mathematics - Abstract
Using a signal-to-noise analysis, the effects of nonlinear modulation of the Hebbian learning rule in the multi-class proximity problem are investigated. Both random classification and classification provided by a Gaussian and a binary teacher are treated. Analytic expressions are derived for the learning and generalization rates around an old and a new prototype. For the proximity problem with binary inputs but Q′-state outputs, it is shown that the optimal modulation is a combination of a hyperbolic tangent and a linear function. As an illustration, numerical results are presented for the two-class and the Q′=3 multi-class problem.
- Published
- 1996
- Full Text
- View/download PDF
21. On the dynamics of analogue neurons with nonsigmoidal gain functions
- Author
- B. Vinck and Désiré Bollé
- Subjects
Statistics and Probability ,Coupling ,Piecewise linear function ,Zero state response ,Hebbian theory ,Control theory ,Attractor ,Chaotic ,Phase (waves) ,Statistical physics ,Condensed Matter Physics ,Chaotic hysteresis ,Mathematics - Abstract
The characteristic properties of the macroscopic retrieval dynamics of analogue neurons with Hebbian coupling strengths are studied, using the shape of the gain function as a modeling parameter. Already at low loading a rich diversity in dynamical behaviour is observed, covering the full range from point attractors to chaotic dynamics. The attractors that are not of the fixed-point type are interpreted as an intermediate phase between the well-known retrieval states and the zero state, enabling a waiting mode in the system. Using a probabilistic approach it is shown that these features persist in extremely and asymmetrically diluted systems at sufficiently low loading. A number of generic examples are worked out in detail, illustrating some properties of networks governed by nonmonotonic piecewise-linear gain functions. The critical storage level above which the chaotic behaviour is absent is determined numerically.
- Published
- 1996
- Full Text
- View/download PDF
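The low-loading scenario of entry 21 can be caricatured by iterating the overlap directly through the gain function. The sketch below does only that, for a saturating gain and for an odd, tent-shaped (nonmonotonic) piecewise-linear gain; the map, slopes and initial overlap are illustrative choices, not taken from the paper.

```python
import numpy as np

def gain_monotonic(x, slope=2.0):
    """Saturating piecewise-linear gain: leads to an ordinary retrieval fixed point."""
    return np.clip(slope * x, -1.0, 1.0)

def gain_nonmonotonic(x, slope=2.0):
    """Odd tent-shaped gain: increases for small |x|, decreases again for large |x|."""
    ax = np.abs(x)
    return np.sign(x) * np.where(ax < 0.5, slope * ax, slope * (1.0 - ax)).clip(-1.0, 1.0)

def orbit(g, m0=0.313, steps=12):
    m, out = m0, []
    for _ in range(steps):
        m = float(g(m))
        out.append(round(m, 4))
    return out

print("monotonic gain          :", orbit(gain_monotonic))       # locks onto m = 1
print("nonmonotonic, steep gain:", orbit(gain_nonmonotonic))     # keeps wandering
print("nonmonotonic, weak gain :",
      orbit(lambda x: gain_nonmonotonic(x, slope=0.8)))          # decays to the zero state
```

A monotonic saturating gain locks onto a retrieval-like fixed point, the tent-shaped gain with a weak slope decays to the zero ("waiting") state, and with a steep slope the orbit keeps wandering over the interval — the kind of diversity the abstract describes.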
22. Nonlinear Hebbian training of the perceptron
- Author
- Gyoung Moo Shim and Désiré Bollé
- Subjects
Artificial neural network ,business.industry ,Generalization ,Gaussian ,Hyperbolic function ,Neuroscience (miscellaneous) ,Physics::Physics Education ,Perceptron ,Nonlinear system ,symbols.namesake ,Hebbian theory ,symbols ,Artificial intelligence ,Boolean function ,business ,Algorithm ,Mathematics - Abstract
The effects of nonlinear modulation of the Hebbian learning rule on the performance of a perceptron are investigated. Both random classification and classification provided by a teacher perceptron are considered. It is seen that both the generalization and learning rates depend on the overlap between the teacher and the student and on the signal-to-noise ratio in the local field. Furthermore, they are independent of the specific teacher distribution when the ratio between the number of training examples and the perceptron size is small. An analytic expression is obtained for the optimal modulation function for different classification schemes. For random and Gaussian teacher classifications the best choice for modulation appears to be linear. For binary teachers it is shown to be the hyperbolic tangent. The modifications to the latter arising from dilution of the binary teacher are also obtained in analytic form.
- Published
- 1995
- Full Text
- View/download PDF
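A bare-bones version of the setting in entry 22, under assumptions not taken from the paper: a random teacher perceptron with a graded output, a student built by a modulated Hebbian sum over the examples, and the generalization error computed from the teacher-student angle. The modulation function is a plug-in; which choice is optimal for which teacher is the content of the paper and is not reproduced by this toy.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 2000                          # input dimension, number of training examples

B = rng.standard_normal(N)                # teacher weights
X = rng.standard_normal((P, N))           # training inputs
y = X @ B / np.sqrt(N)                    # graded teacher output for each example

def hebbian_student(f):
    """Modulated Hebb rule: weights are the f-weighted average of the examples."""
    return (f(y)[:, None] * X).mean(axis=0)

def generalization_error(J):
    """Probability of disagreeing with the teacher's sign on a random input."""
    rho = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
    return np.arccos(rho) / np.pi

for name, f in [("linear", lambda u: u),
                ("tanh  ", np.tanh),
                ("sign  ", np.sign)]:
    err = generalization_error(hebbian_student(f))
    print(f"modulation {name} -> generalization error {err:.3f}")
```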
23. On the multi-neuron interaction model without truncating the interaction
- Author
- J. Huyghebaert, Désiré Bollé, and Gyoung Moo Shim
- Subjects
Artificial neural network ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Interaction model ,Condensed Matter::Disordered Systems and Neural Networks ,Stability (probability) ,Combinatorics ,Mean field theory ,Order (group theory) ,Statistical physics ,Zero temperature ,Mathematical Physics ,Lattice model (physics) ,Phase diagram ,Mathematics - Abstract
A replica-symmetric mean-field theory approach is presented to the multi-neuron interaction model introduced by de Almeida and Iglesias (1990 Phys. Lett. 146A 239). Fixed-point equations are derived for the relevant order parameters of the model, extended to include biased patterns, without truncating the interaction. The capacity-bias and the temperature-capacity phase diagrams are discussed. Compared with the truncated version of the model, it is found that the capacity at zero temperature is infinite and that the retrieval states satisfy the de Almeida-Thouless stability condition.
- Published
- 1994
- Full Text
- View/download PDF
24. Thermodynamic properties of fully connected Q-Ising neural networks
- Author
- Désiré Bollé, Gyoung Moo Shim, and H. Rieger
- Subjects
Artificial neural network ,media_common.quotation_subject ,General Physics and Astronomy ,Thermodynamics ,Statistical and Nonlinear Physics ,Infinity ,Perceptron ,Mean field theory ,Order (group theory) ,Ising model ,Statistical physics ,Mathematical Physics ,Gain function ,media_common ,Phase diagram ,Mathematics - Abstract
The thermodynamic and retrieval properties of fully connected Q-Ising networks are studied in the replica-symmetric mean-field approximation. In particular, capacity-gain parameter and capacity-temperature phase diagrams are derived for Q = 3, 4 and Q = ∞ and different distributions of the stored patterns. Furthermore, the optimal gain function is determined in order to obtain the best performance. Where appropriate, the results are compared with the diluted and layered versions of these models.
- Published
- 1994
- Full Text
- View/download PDF
25. Capacity of diluted multi-state neural networks
- Author
- J. van Mourik and Désiré Bollé
- Subjects
Spin glass ,Condensed matter physics ,Artificial neural network ,Multi state ,Distribution (number theory) ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Condensed Matter::Disordered Systems and Neural Networks ,Dilution ,Ising model ,Line (text file) ,Biological system ,Mathematical Physics ,Mathematics - Abstract
The optimal storage capacity is studied for diluted networks with multi-state neurons and either continuous or discrete couplings, within the replica-symmetric Gardner approach. The Gardner-Derrida line, the de Almeida-Thouless line and the zero-entropy line are compared and the validity of the replica-symmetric approximation is discussed in detail. The distribution of the synaptic couplings is determined. The results are analysed in terms of the number of states of the neuron, the distribution of the stored patterns, the amount of dilution and the number of discrete values for the couplings.
- Published
- 1994
- Full Text
- View/download PDF
26. Retrieval and chaos in extremely diluted Q-Ising neural networks
- Author
- B. Vinck, V. A. Zagrebnov, Désiré Bollé, and Gyoung Moo Shim
- Subjects
CHAOS (operating system) ,Artificial neural network ,Time evolution ,Chaotic ,Probabilistic logic ,Recursion (computer science) ,Parallel dynamics ,Statistical and Nonlinear Physics ,Ising model ,Statistical physics ,Algorithm ,Mathematical Physics ,Mathematics - Abstract
Using a probabilistic approach, the deterministic and the stochastic parallel dynamics of a Q-Ising neural network are studied at finite Q and in the limit Q→∞. Exact evolution equations are presented for the first time-step. These formulas constitute recursion relations for the parallel dynamics of the extremely diluted asymmetric versions of these networks. An explicit analysis of the retrieval properties is carried out in terms of the gain parameter, the loading capacity, and the temperature. The results for the Q→∞ network are compared with those for the Q=3 and Q=4 models. Possible chaotic microscopic behavior is studied using the time evolution of the distance between two network configurations. For arbitrary finite Q the retrieval regime is always chaotic. In the limit Q→∞ the network exhibits a dynamical transition toward chaos.
- Published
- 1994
- Full Text
- View/download PDF
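A crude numerical companion to entry 26: an extremely diluted, asymmetric three-state (Q = 3) network with Hebbian couplings is run twice from initial states differing on a handful of neurons, and the fraction of sites on which the two copies disagree is recorded. Connectivity, loading and gain parameter are arbitrary demo values, and nothing here reproduces the paper's analytic treatment.

```python
import numpy as np

rng = np.random.default_rng(2)
N, C, p, b = 3000, 20, 4, 0.4      # neurons, in-connectivity, stored patterns, gain parameter

states = np.array([-1, 0, 1])
xi = rng.choice(states, size=(p, N))                 # stored three-state patterns

# Each neuron listens to C randomly chosen neurons (asymmetric dilution).
inputs = np.array([rng.choice(N, size=C, replace=False) for _ in range(N)])
J = np.einsum("mi,mij->ij", xi, xi[:, inputs]) / C   # J[i, k] couples neuron i to inputs[i, k]

def update(sigma):
    """Zero-temperature parallel dynamics of the three-state (Q = 3) network."""
    h = np.einsum("ik,ik->i", J, sigma[inputs])      # local fields
    new = np.sign(h).astype(int)
    new[np.abs(h) < b] = 0                           # the gain parameter b selects the 0 state
    return new

sigma_a = xi[0].copy()
sigma_b = xi[0].copy()
flip = rng.choice(N, size=5, replace=False)          # tiny initial difference between the copies
sigma_b[flip] = rng.choice(states, size=5)

for t in range(10):
    d = np.mean(sigma_a != sigma_b)
    print(f"t = {t:2d}   distance = {d:.3f}")
    sigma_a, sigma_b = update(sigma_a), update(sigma_b)
```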
27. Spectra of sparse regular graphs with loops
- Author
- Désiré Bollé, Fernando L. Metz, and Izaak Neri
- Subjects
FOS: Computer and information sciences ,Physics - Physics and Society ,Biophysics ,Systems Theory ,FOS: Physical sciences ,Physics and Society (physics.soc-ph) ,Spectral line ,Synchronization ,Computer Communication Networks ,Condensed Matter - Statistical Mechanics ,Mathematical Physics ,Network model ,Mathematics ,Discrete mathematics ,Social and Information Networks (cs.SI) ,Models, Statistical ,Statistical Mechanics (cond-mat.stat-mech) ,Physics ,Spectrum (functional analysis) ,Exact differential equation ,Reproducibility of Results ,Computer Science - Social and Information Networks ,Signal Processing, Computer-Assisted ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Mathematical Physics (math-ph) ,Condensed Matter - Disordered Systems and Neural Networks ,Spectral gap ,Algorithms - Abstract
We derive exact equations that determine the spectra of undirected and directed sparsely connected regular graphs containing loops of arbitrary length. The implications of our results for the structural and dynamical properties of networks are discussed by showing how loops influence the size of the spectral gap and the propensity for synchronization. Analytical formulas for the spectrum are obtained for specific loop lengths.
- Published
- 2011
- Full Text
- View/download PDF
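Entry 27 duplicates entry 6 and derives exact spectral equations for regular graphs with loops. As a baseline, the sketch below simply diagonalizes the adjacency matrix of a random regular graph and compares the histogram with the Kesten-McKay density expected for large, locally tree-like (loop-free) regular graphs; graph size, degree and binning are arbitrary, and none of the paper's loop corrections appear here.

```python
import numpy as np
import networkx as nx

n, d = 2000, 4                                    # graph size and degree
G = nx.random_regular_graph(d, n, seed=7)
eigs = np.linalg.eigvalsh(nx.to_numpy_array(G))   # adjacency spectrum

def kesten_mckay(lam, d):
    """Limiting spectral density of large random d-regular (locally tree-like) graphs."""
    support = np.abs(lam) <= 2 * np.sqrt(d - 1)
    rho = np.zeros_like(lam)
    rho[support] = (d * np.sqrt(4 * (d - 1) - lam[support] ** 2)
                    / (2 * np.pi * (d ** 2 - lam[support] ** 2)))
    return rho

edges = np.linspace(-3.5, 3.5, 8)
centers = 0.5 * (edges[:-1] + edges[1:])
hist, _ = np.histogram(eigs[eigs < d - 1e-9], bins=edges, density=True)  # drop the trivial eigenvalue d
km = kesten_mckay(centers, d)
for c, h, k in zip(centers, hist, km):
    print(f"lambda = {c:5.2f}   empirical = {h:.3f}   Kesten-McKay = {k:.3f}")
```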
28. Mixture states and storage of biased patterns in Potts-glass neural networks
- Author
- Désiré Bollé and J. Huyghebaert
- Subjects
Infinite number ,Artificial neural network ,Learning rule ,Stability (learning theory) ,Embedding ,Statistical physics ,State (functional analysis) ,Condensed Matter::Disordered Systems and Neural Networks ,Finite set ,Mathematics - Abstract
The presence and stability of mixture states in Q-state Potts neural networks are studied for different learning rules within the replica-symmetric mean-field-theory approach. The retrieval properties of the asymmetric mixture states are examined in the case of biased patterns. For the storage of a finite number of such patterns, these properties are compared for the usual Hebb learning rule and some variants obtained by subtracting, for a certain pattern, the average of the Potts neuron state over all the other patterns. The latter are introduced to suppress the symmetric mixture states. Furthermore, the embedding of an additional, infinite number of unbiased patterns stored with the Hebb rule is allowed.
- Published
- 1993
- Full Text
- View/download PDF
29. Optimal capacity of graded-response perceptrons
- Author
- J. van Mourik, Désiré Bollé, and Reimer Kühn
- Subjects
Stability (learning theory) ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Perceptron ,Algorithm ,Mathematical Physics ,Mathematics - Abstract
Optimal capacities of perceptrons with graded input-output relations are computed within the Gardner approach (1988). The influence of desired output precision, stability with respect to input errors, and output-pattern statistics are analysed and discussed.
- Published
- 1993
- Full Text
- View/download PDF
30. On the parallel dynamics of the Q-state Potts and Q-Ising neural networks
- Author
- B. Vinck, V. A. Zagrebnov, and Désiré Bollé
- Subjects
Combinatorics ,Artificial neural network ,Parallel processing (DSP implementation) ,Mathematical model ,Zero (complex analysis) ,Recursion (computer science) ,Statistical and Nonlinear Physics ,Ising model ,Statistical physics ,Absolute zero ,Mathematical Physics ,Potts model ,Mathematics - Abstract
Using a probabilistic approach, the parallel dynamics of the Q-state Potts and Q-Ising neural networks are studied at zero and at nonzero temperatures. Evolution equations are derived for the first time step and arbitrary Q. These formulas constitute recursion relations for the exact parallel dynamics of the extremely diluted asymmetric versions of these networks. An explicit analysis, including dynamical capacity-temperature diagrams and the temperature dependence of the overlap, is carried out for Q=3. Both types of models are compared.
- Published
- 1993
- Full Text
- View/download PDF
31. Mean-field theory for the Q-state Potts-glass neural network with biased patterns
- Author
- Désiré Bollé, Patrick Dupont, Ronald Cools, and J. Huyghebaert
- Subjects
Artificial neural network ,Replica ,Diagram ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,State (functional analysis) ,Stability (probability) ,Quality (physics) ,Mean field theory ,Statistical physics ,Algorithm ,Mathematical Physics ,Mathematics ,Potts model - Abstract
A systematic study of the Q-state Potts model of neural networks, extended to include biased patterns, is made for extensive loading α. Mean-field equations are written down within the replica symmetric approximation, for general Q and arbitrary temperature T. For the Q=3 model and two classes of representative bias parameters, the storage capacity and retrieval quality at zero temperature are discussed as functions of the bias, taking into account the Mattis retrieval state and the lowest symmetric states. The T-α diagram is obtained and the stability properties of the retrieval state are analysed at finite temperatures. A comparison is made with the biased Hopfield model.
- Published
- 1993
- Full Text
- View/download PDF
32. Optimal storage capacity for diluted multi-state neural networks: continuous and discrete couplings
- Author
- J. van Mourik, Patrick Dupont, and Désiré Bollé
- Subjects
Statistics and Probability ,Physics ,Multi state ,Artificial neural network ,Condensed Matter Physics ,Topology ,Algorithm ,Dilution - Abstract
Using the Gardner approach, the optimal storage capacity is considered for diluted networks with multi-state neurons and spherical or local constraints on the couplings. First results on its dependence on the number of states of the neuron, the amount of dilution and the number of discrete values for the couplings are discussed.
- Published
- 1992
- Full Text
- View/download PDF
33. On the phase diagram of the Q-state Potts-glass neural network
- Author
- Désiré Bollé, J. Huyghebaert, and Patrick Dupont
- Subjects
Statistics and Probability ,Spin glass ,Condensed matter physics ,Mean field theory ,Artificial neural network ,Replica ,Statistical physics ,Condensed Matter Physics ,Bifurcation diagram ,Bifurcation ,Potts model ,Phase diagram ,Mathematics - Abstract
The Q-state Potts model of neural networks, extended to include biased patterns, is studied for extensive loading α. Mean-field equations are written down for general Q and arbitrary temperature T. Within the replica symmetric approximation, the complete T-α phase diagram is discussed, especially for Q = 3. A study of the bifurcation diagrams for the overlap and the free energy clarifies the results obtained.
- Published
- 1992
- Full Text
- View/download PDF
34. On the overlap dynamics of multi-state neural networks with a finite number of patterns
- Author
- Désiré Bollé, Patrick Dupont, and B. Vinck
- Subjects
Lyapunov function ,Artificial neural network ,Time evolution ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Fixed point ,Stability (probability) ,Combinatorics ,symbols.namesake ,Hebbian theory ,symbols ,Statistical physics ,Absolute zero ,Finite set ,Mathematical Physics ,Mathematics - Abstract
Neural networks with multi-state neurons are studied in the case of low loading. For symmetric couplings satisfying a certain positivity condition, a Lyapunov function is shown to exist in the space of overlaps between the instantaneous microscopic state of the system and the learned patterns. Furthermore, an algorithm is derived for zero temperature to determine all the fixed points. As an illustration, the three-state model is worked out explicitly for Hebbian couplings. For finite temperature the time evolution of the overlap is studied for couplings which need not be symmetric. The stability properties are discussed in detail for the three-state model. For asymmetric couplings limit-cycle behaviour is shown to be possible.
- Published
- 1992
- Full Text
- View/download PDF
35. Thermodynamic properties of the Q-state Potts-glass neural network
- Author
- J. Huyghebaert, Patrick Dupont, and Désiré Bollé
- Subjects
Physics ,Alpha (programming language) ,Spin glass ,Tricritical point ,Artificial neural network ,State (functional analysis) ,Statistical physics ,Stability (probability) ,Atomic and Molecular Physics, and Optics ,Phase diagram ,Potts model ,Mathematical physics - Abstract
The Q-state Potts model of neural networks, extended to include biased patterns, is studied for extensive loading α. Within the replica-symmetric approximation, mean-field equations are written down for general Q and arbitrary temperature T. The critical storage capacity is discussed for Q=3 and two classes of representative bias parameters. The complete T-α phase diagram is presented. A tricritical point is found in the spin-glass transition for Q ≳ 6, depending on α. Contrary to the Hopfield model, the critical lines do not converge to the same T as α → 0. A stability analysis is made.
- Published
- 1992
- Full Text
- View/download PDF
36. The phase diagram of Lévy spin glasses
- Author
- Fernando L. Metz, Izaak Neri, and Désiré Bollé
- Subjects
Statistics and Probability ,Physics ,Cavity method ,Spin glass ,Gaussian ,Replica ,Statistical and Nonlinear Physics ,Condensed Matter - Disordered Systems and Neural Networks ,Condensed Matter::Disordered Systems and Neural Networks ,symbols.namesake ,Quantum mechanics ,symbols ,Symmetry breaking ,Statistics, Probability and Uncertainty ,Local field ,Spin-½ ,Ansatz - Abstract
We study the Lévy spin-glass model with the replica and the cavity method. In this model each spin interacts through a finite number of strong bonds and an infinite number of weak bonds. This hybrid behaviour of Lévy spin glasses becomes transparent in our solution: the local field contains a part propagating along a backbone of strong bonds and a Gaussian noise term due to weak bonds. Our method allows us to determine the complete replica symmetric phase diagram, the replica symmetry breaking line and the entropy. The results are compared with simulations and previous calculations using a Gaussian ansatz for the distribution of fields.
- Published
- 2009
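The "few strong bonds plus many weak bonds" picture invoked in entry 36 can be made concrete by sampling couplings from a heavy-tailed stable law. The sketch below uses scipy's stable-law sampler with the usual N^(-1/alpha) Lévy-matrix scaling; the stability index, the scaling convention and the cut between "strong" and "weak" bonds are assumptions of this illustration, not parameters quoted from the paper.

```python
import numpy as np
from scipy.stats import levy_stable

N, alpha = 400, 1.5                        # number of spins, stability index (0 < alpha < 2)
n_bonds = N * (N - 1) // 2

# Symmetric alpha-stable couplings, scaled so that a typical bond is weak
# while a few bonds per spin remain of order one (assumed Levy-matrix convention).
J = levy_stable.rvs(alpha, 0.0, size=n_bonds, random_state=7) * N ** (-1.0 / alpha)

strong = np.abs(J) > 0.5                   # arbitrary cut separating "strong" from "weak" bonds
print(f"bonds per spin in total  : {2 * n_bonds / N:.0f}")
print(f"'strong' bonds per spin  : {2 * strong.sum() / N:.2f}")
print(f"median |J| of weak bonds : {np.median(np.abs(J[~strong])):.4f}")
```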
37. The Cavity Approach to Parallel Dynamics of Ising Spins on a Graph
- Author
- Izaak Neri and Désiré Bollé
- Subjects
Statistics and Probability ,Physics ,Cavity method ,Stationary distribution ,Bethe lattice ,Artificial neural network ,FOS: Physical sciences ,Statistical and Nonlinear Physics ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Hebbian theory ,Ising model ,Statistical physics ,Statistics, Probability and Uncertainty ,Finite set ,Phase diagram - Abstract
We use the cavity method to study parallel dynamics of disordered Ising models on a graph. In particular, we derive a set of recursive equations in single site probabilities of paths propagating along the edges of the graph. These equations are analogous to the cavity equations for equilibrium models and are exact on a tree. On graphs with exclusively directed edges we find an exact expression for the stationary distribution of the spins. We present the phase diagrams for an Ising model on an asymmetric Bethe lattice and for a neural network with Hebbian interactions on an asymmetric scale-free graph. For graphs with a nonzero fraction of symmetric edges the equations can be solved for a finite number of time steps. Theoretical predictions are confirmed by simulation results. Using a heuristic method, the cavity equations are extended to a set of equations that determine the marginals of the stationary distribution of Ising models on graphs with a nonzero fraction of symmetric edges. The results of this method are discussed and compared with simulations.
- Published
- 2009
38. The Optimal Storage Capacity for a Neural Network with Multi-State Neurons
- Author
- Désiré Bollé, Patrick Dupont, and J. van Mourik
- Subjects
Constraint (information theory) ,Multi state ,Artificial neural network ,Mean field theory ,General Physics and Astronomy ,Dynamical system ,Topology ,Stability (probability) ,Gain function ,Mathematics - Abstract
Using the Gardner approach, the optimal storage capacity is discussed for a network with multi-state neurons and a spherical constraint on the couplings. It is found that this capacity decreases with the number of states and the corresponding separation of the plateaus of the gain function. An analysis of the validity of the replica-symmetric approximation is made.
- Published
- 1991
- Full Text
- View/download PDF
39. Stability properties of Potts neural networks with biased patterns and low loading
- Author
- Désiré Bollé, J. van Mourik, and Patrick Dupont
- Subjects
Spin glass ,Artificial neural network ,Condensed matter physics ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Function (mathematics) ,Stability (probability) ,Mean field theory ,Statistical physics ,Zero temperature ,Finite set ,Mathematical Physics ,Mathematics ,Potts model - Abstract
The q-state Potts glass model of neural networks is extended to include biased patterns. For a finite number of such patterns, the existence and stability properties of the Mattis states and symmetric states are discussed in detail as a function of the bias. Analytic results are presented for all q at zero temperature. For finite temperatures numerical results are obtained for q=3 and two classes of representative bias parameters. A comparison is made with the Hopfield model.
- Published
- 1991
- Full Text
- View/download PDF
40. On Potts-glass neural networks with biased patterns
- Author
- Patrick Dupont and Désiré Bollé
- Subjects
Physics ,Recurrent neural network ,Artificial neural network ,Time evolution ,State (functional analysis) ,Statistical physics ,Stochastic neural network ,Stability (probability) ,Finite set ,Equations for a falling body - Abstract
Neural networks of the q-state Potts type are considered for a finite number of biased patterns. The existence and stability properties of the Mattis states and symmetric states are discussed at zero temperature for all q and arbitrary bias and near the “critical” temperature for q = 3 and two classes of representative bias parameters. Some of these properties are illustrated through the solution of the dynamical equations governing the time evolution of the macroscopic overlap between the learned patterns and the instantaneous microscopic state of the network.
- Published
- 2008
- Full Text
- View/download PDF
41. Scattering theory methods in reacting plasmas
- Author
- Désiré Bollé
- Subjects
Physics ,Free particle ,Grand canonical ensemble ,Virial coefficient ,Quantum electrodynamics ,Virial expansion ,Cluster coefficient ,Plasma ,Scattering theory ,Inelastic scattering - Published
- 2008
- Full Text
- View/download PDF
42. Levinson's theorems and the quantum-mechanical partition function for plasmas
- Author
- Désiré Bollé
- Subjects
Physics ,Partition function (quantum field theory) ,Plasma ,Quantum statistical mechanics ,Quantum ,Mathematical physics - Published
- 2008
- Full Text
- View/download PDF
43. Time delay in N-body scattering
- Author
- Désiré Bollé and T. A. Osborn
- Subjects
Physics ,Scattering ,Atomic physics - Published
- 2008
- Full Text
- View/download PDF
44. Gallager error correcting codes for binary asymmetric channels
- Author
- Désiré Bollé, Izaak Neri, and N. S. Skantzos
- Subjects
Statistics and Probability ,Computer science ,Message passing ,FOS: Physical sciences ,Binary number ,Statistical and Nonlinear Physics ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Statistical mechanics ,Data_CODINGANDINFORMATIONTHEORY ,Condensed Matter - Disordered Systems and Neural Networks ,Error correcting ,Statistics, Probability and Uncertainty ,Algorithm ,Decoding methods ,Computer Science::Information Theory - Abstract
We derive critical noise levels for Gallager codes on asymmetric channels as a function of the input bias and the temperature. Using a statistical mechanics approach we study the space of codewords and the entropy in the various decoding regimes. We further discuss the relation between the convergence of the message-passing algorithm and the endogeny property and complexity characterizing solutions of the recursive distributional equations for the cavity fields.
- Published
- 2008
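A toy companion to entry 44, with the decoder deliberately simplified: instead of the finite-temperature message passing analysed in the paper, the sketch runs Gallager's plain bit-flipping algorithm on a small random LDPC code, and the channel is binary asymmetric (0→1 and 1→0 flips occur with different probabilities). Block length, degrees and noise levels are arbitrary demo values.

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, col_w = 240, 120, 3          # code length, number of checks, ones per column

# Sparse parity-check matrix with col_w ones per column (rows chosen at random).
H = np.zeros((m, n), dtype=int)
for j in range(n):
    H[rng.choice(m, size=col_w, replace=False), j] = 1

def asymmetric_channel(bits, p01, p10):
    """Flip 0 -> 1 with probability p01 and 1 -> 0 with probability p10."""
    r = rng.random(bits.size)
    flip = np.where(bits == 0, r < p01, r < p10)
    return (bits ^ flip).astype(int)

def bit_flip_decode(H, word, max_iter=50):
    """Gallager's bit-flipping: repeatedly flip the bits in the most unsatisfied checks."""
    x = word.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            break
        unsat = syndrome @ H                 # per-bit count of unsatisfied checks
        x = np.where(unsat == unsat.max(), 1 - x, x)
    return x

codeword = np.zeros(n, dtype=int)            # the all-zero word is always a codeword
received = asymmetric_channel(codeword, p01=0.04, p10=0.20)
decoded = bit_flip_decode(H, received)

print("channel errors  :", int(received.sum()))
print("residual errors :", int((decoded != codeword).sum()))
```

Because the transmitted word is the all-zero codeword, only the 0→1 flips actually act here; the dependence of the critical noise level on this channel asymmetry is what the paper analyses.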
45. On the general structure of supersymmetric quantum mechanical models
- Author
- Harald Grosse, Désiré Bollé, and Patrick Dupont
- Subjects
Physics ,Nuclear and High Energy Physics ,symbols.namesake ,Quantization (physics) ,Mechanical models ,symbols ,Supersymmetry ,Quantum ,Lagrangian ,Matrix similarity ,Mathematical physics - Abstract
We require total invariance of a Lagrangian under supersymmetry transformations and we observe that special variables are singled out. They are identical to those entering the Nicolai mapping. We show that a similarity transformation is connected with the introduction of these new variables. We give a stochastic formulation of this transformation using the Cameron-Martin formula.
- Published
- 1990
- Full Text
- View/download PDF
46. Small-world hypergraphs on a bond-disordered Bethe lattice
- Author
- Désiré Bollé and Rob Heylen
- Subjects
Ising chain ,Statistical Mechanics (cond-mat.stat-mech) ,Bethe lattice ,Replica ,Complex system ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Condensed Matter::Disordered Systems and Neural Networks ,Explicit symmetry breaking ,Ferromagnetism ,Statistical physics ,Symmetry breaking ,Replica trick ,Condensed Matter - Statistical Mechanics ,Mathematics - Abstract
We study the thermodynamic properties of spin systems with bond disorder on small-world hypergraphs, obtained by superimposing a one-dimensional Ising chain onto a random Bethe graph with p-spin interactions. Using transfer-matrix techniques, we derive fixed-point equations describing the relevant order parameters and the free energy, both in the replica-symmetric and the one-step replica-symmetry-breaking approximation. We determine the static and dynamic ferromagnetic transition and the spin-glass transition within replica symmetry for all temperatures, and demonstrate corrections to these results when one-step replica symmetry breaking is taken into account. The results obtained are in agreement with Monte Carlo simulations.
- Published
- 2007
47. Thermodynamics of spin systems on small-world hypergraphs
- Author
- N. S. Skantzos, Désiré Bollé, and Rob Heylen
- Subjects
Random graph ,education.field_of_study ,Statistical Mechanics (cond-mat.stat-mech) ,Population ,Complex system ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Complex network ,Condensed Matter - Disordered Systems and Neural Networks ,Poisson distribution ,Set (abstract data type) ,symbols.namesake ,symbols ,Statistical physics ,Special case ,education ,Condensed Matter - Statistical Mechanics ,Mathematics ,Spin-½ - Abstract
We study the thermodynamic properties of spin systems on small-world hypergraphs, obtained by superimposing sparse Poisson random graphs with p-spin interactions onto a one-dimensional Ising chain with nearest-neighbor interactions. We use replica-symmetric transfer-matrix techniques to derive a set of fixed-point equations describing the relevant order parameters and free energy, and solve them employing population dynamics. In the special case where the number of connections per site is of the order of the system size we are able to solve the model analytically. In the more general case where the number of connections is finite we determine the static and dynamic ferromagnetic-paramagnetic transitions using population dynamics. The results are tested against Monte-Carlo simulations.
- Published
- 2006
48. Gardner optimal capacity of the diluted Blume-Emery-Griffiths neural network
- Author
- Désiré Bollé and I. Pérez Castillo
- Subjects
Statistics and Probability ,Coupling ,Statistical Mechanics (cond-mat.stat-mech) ,Artificial neural network ,Condensed matter physics ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Condensed Matter Physics ,Condensed Matter::Disordered Systems and Neural Networks ,Dilution ,Entropy (information theory) ,Embedding ,Statistical physics ,Ternary operation ,Condensed Matter - Statistical Mechanics ,Mathematics - Abstract
The optimal capacity of a diluted Blume-Emery-Griffiths neural network is studied as a function of the pattern activity and the embedding stability using the Gardner entropy approach. Annealed dilution is considered, cutting some of the couplings referring to the ternary patterns themselves and some of the couplings related to the active patterns, both simultaneously (synchronous dilution) or independently (asynchronous dilution). Through the de Almeida-Thouless criterion it is found that the replica-symmetric solution is locally unstable as soon as there is dilution. The distribution of the couplings shows the typical gap with a width depending on the amount of dilution, but this gap persists even in cases where a particular type of coupling plays no role in the learning process.
- Published
- 2004
49. Self-control dynamics for sparsely coded networks with synaptic noise
- Author
- Rob Heylen and Désiré Bollé
- Subjects
Theoretical computer science ,Artificial neural network ,Statistical Mechanics (cond-mat.stat-mech) ,Computer science ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Content-addressable memory ,Condensed Matter - Disordered Systems and Neural Networks ,Transfer function ,Synaptic noise ,Attractor ,Content-addressable storage ,Algorithm ,Condensed Matter - Statistical Mechanics - Abstract
For the retrieval dynamics of sparsely coded attractor associative memory models with synaptic noise the inclusion of a macroscopic time-dependent threshold is studied. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the memorized patterns, adapting itself automatically in the course of the time evolution, an autonomous functioning of the model is guaranteed. This self-control mechanism considerably improves the quality of the fixed-point retrieval dynamics, in particular the storage capacity, the basins of attraction and the mutual information content. (To appear in the proceedings of the 2004 International Joint Conference on Neural Networks, Budapest, IEEE.)
- Published
- 2004
- Full Text
- View/download PDF
50. Multiplicative versus additive noise in multi-state neural networks
- Author
- Toni Verbeiren, Désiré Bollé, and J. Busquets Blanco
- Subjects
Artificial neural network ,Quantitative Biology::Neurons and Cognition ,Statistical Mechanics (cond-mat.stat-mech) ,Multiplicative function ,FOS: Physical sciences ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Noise (electronics) ,Multiplicative noise ,symbols.namesake ,Hebbian theory ,Gaussian noise ,Learning rule ,symbols ,Statistical physics ,Pruning (decision trees) ,Condensed Matter - Statistical Mechanics ,Mathematics - Abstract
The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings. (To appear in the proceedings of the Conference on Noise in Complex Systems and Stochastic Dynamics II, SPIE International.)
- Published
- 2004
- Full Text
- View/download PDF
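The closing statement of entry 50 — random, partly anti-Hebbian dilution acts like a rescaling of the Hebbian couplings plus zero-mean noise — can be checked numerically in a few lines. The sketch below builds Hebbian couplings, dilutes them with a fraction of sign-flipped bonds, and regresses the diluted couplings on the originals; the regression is only a sanity check, not the paper's generating-functional calculation, and the network size, connectivity and anti-Hebbian fraction are arbitrary demo values.

```python
import numpy as np

rng = np.random.default_rng(4)
N, p = 1000, 50                    # neurons and stored binary patterns
c, q = 0.6, 0.2                    # keep a bond with prob c; flip its sign with prob q

xi = rng.choice([-1.0, 1.0], size=(p, N))
J = xi.T @ xi / N                  # Hebbian couplings
np.fill_diagonal(J, 0.0)

keep = rng.random((N, N)) < c      # dilution mask (independent entries: asymmetric dilution)
sign = np.where(rng.random((N, N)) < q, -1.0, 1.0)
J_diluted = J * keep * sign

# Regress the diluted couplings on the originals: J_diluted ~ slope * J + noise.
i, j = np.triu_indices(N, k=1)
x, y = J[i, j], J_diluted[i, j]
slope = np.dot(x, y) / np.dot(x, x)
residual = y - slope * x

print(f"fitted scaling factor : {slope:.3f}   (expected c*(1-2q) = {c * (1 - 2 * q):.3f})")
print(f"residual mean         : {residual.mean():.4f}")
print(f"residual std          : {residual.std():.4f}")
```

Summed over the many bonds that enter a local field, the per-bond residual becomes effectively Gaussian, which is the sense in which the abstract speaks of additive Gaussian noise.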