Temporal Complexity of a Hopfield-Type Neural Model in Random and Scale-Free Graphs
- Publication Year :
- 2024
Abstract
- The Hopfield network model and its generalizations were introduced as a model of associative, or content-addressable, memory. They have been widely investigated both as an unsupervised learning method in artificial intelligence and as a model of biological neural dynamics in computational neuroscience. The complexity features of biological neural networks have attracted the scientific community's interest over the last two decades. More recently, concepts and tools borrowed from complex network theory have been applied to artificial neural networks and learning, with a focus on topological aspects. However, the temporal structure is also a crucial property displayed by biological neural networks, and it has been investigated in the framework of systems displaying complex intermittency. The Intermittency-Driven Complexity (IDC) approach indeed focuses on the metastability of self-organized states, whose signature is a power-law decay in the inter-event time distribution or a scaling behaviour in the related event-driven diffusion processes. The investigation of IDC in neural dynamics, and of its relationship with network topology, is still in its early stages. In this work, we present preliminary results of an IDC analysis carried out on a bio-inspired Hopfield-type neural network, comparing two different connectivities: scale-free vs. random network topology. We find that random networks can trigger complexity features similar to those of scale-free networks, albeit with some differences and for different parameter values, in particular for different noise levels.
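- The abstract does not reproduce the model equations, so the sketch below is only a rough illustration of the kind of comparison described: a noisy Hopfield-type dynamics run on a random (Erdős–Rényi) graph and on a scale-free (Barabási–Albert) graph, with inter-event times extracted from the activity. All modelling choices here (Hebbian couplings masked by the adjacency matrix, a Glauber-like stochastic update whose temperature plays the role of the noise level, an overlap-sign-flip event definition, and every parameter value) are assumptions for illustration, not the authors' actual setup.

```python
# Hedged sketch: noisy Hopfield-type dynamics on random vs. scale-free graphs,
# with a toy "event" definition used to collect inter-event times.
# This is an illustrative reconstruction, NOT the paper's model.
import numpy as np
import networkx as nx

def simulate(graph, n_patterns=3, noise=0.3, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    A = nx.to_numpy_array(graph)                    # adjacency mask
    n = A.shape[0]
    xi = rng.choice([-1, 1], size=(n_patterns, n))  # random binary patterns
    W = (xi.T @ xi) / n * A                         # Hebbian weights restricted to the graph
    np.fill_diagonal(W, 0.0)
    s = rng.choice([-1, 1], size=n)                 # random initial state
    events = []
    prev_sign = np.sign(s @ xi[0] / n)
    for t in range(steps):
        i = rng.integers(n)                         # asynchronous single-spin update
        h = W[i] @ s
        # Glauber-like stochastic rule; `noise` acts as an effective temperature
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / max(noise, 1e-9)))
        s[i] = 1 if rng.random() < p_up else -1
        m = s @ xi[0] / n                           # overlap with pattern 0
        sign = np.sign(m) if m != 0 else prev_sign
        if sign != prev_sign:                       # toy event: sign flip of the overlap
            events.append(t)
            prev_sign = sign
    return np.diff(events)                          # inter-event times

n = 500
iet_random = simulate(nx.erdos_renyi_graph(n, p=0.02, seed=1))
iet_sf = simulate(nx.barabasi_albert_graph(n, m=5, seed=1))
# A power-law tail in the inter-event time histogram would be the IDC signature
# mentioned in the abstract; here we only print crude summary statistics.
print("random graph:    ", len(iet_random), "events, mean IET =",
      iet_random.mean() if len(iet_random) else None)
print("scale-free graph:", len(iet_sf), "events, mean IET =",
      iet_sf.mean() if len(iet_sf) else None)
```

- In an actual IDC analysis one would estimate the tail exponent of the inter-event time distribution (or the scaling of the associated event-driven diffusion) rather than the mean; the hypothetical event definition above is only a placeholder for whatever event criterion the paper adopts.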
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2406.12895
- Document Type :
- Working Paper