36 results for "Ricciardi, L. M."
Search Results
2. Exponential Trends of Ornstein-Uhlenbeck First-Passage-Time Densities
- Author
- Nobile, A. G., Ricciardi, L. M., and Sacerdote, L.
- Published
- 1985
- Full Text
- View/download PDF
3. On an Integral Equation for First-Passage-Time Probability Densities
- Author
- Ricciardi, L. M., Sacerdote, L., and Sato, S.
- Published
- 1984
- Full Text
- View/download PDF
4. On the Inverse of the First Passage Time Probability Problem
- Author
- Capocelli, R. M. and Ricciardi, L. M.
- Published
- 1972
- Full Text
- View/download PDF
5. Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks.
- Author
- Stöckel, Andreas and Eliasmith, Chris
- Subjects
- NEURONS, MULTIPLICATION
- Abstract
Nonlinear interactions in the dendritic tree play a key role in neural computation. Nevertheless, modeling frameworks aimed at the construction of large-scale, functional spiking neural networks, such as the Neural Engineering Framework, tend to assume a linear superposition of postsynaptic currents. In this letter, we present a series of extensions to the Neural Engineering Framework that facilitate the construction of networks incorporating Dale's principle and nonlinear conductance-based synapses. We apply these extensions to a two-compartment LIF neuron that can be seen as a simple model of passive dendritic computation. We show that it is possible to incorporate neuron models with input-dependent nonlinearities into the Neural Engineering Framework without compromising high-level function and that nonlinear postsynaptic currents can be systematically exploited to compute a wide variety of multivariate, band-limited functions, including the Euclidean norm, controlled shunting, and nonnegative multiplication. By avoiding an additional source of spike noise, the function approximation accuracy of a single layer of two-compartment LIF neurons is on a par with or even surpasses that of two-layer spiking neural networks up to a certain target function bandwidth. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
6. Stochastic IMT (Insulator-Metal-Transition) Neurons: An Interplay of Thermal and Threshold Noise at Bifurcation.
- Author
- Parihar, Abhinav, Jerry, Matthew, Datta, Suman, and Raychowdhury, Arijit
- Subjects
- NEURONS, BIFURCATION theory, ARTIFICIAL neural networks
- Abstract
Artificial neural networks can harness stochasticity in multiple ways to enable a vast class of computationally powerful models. Boltzmann machines and other stochastic neural networks have been shown to outperform their deterministic counterparts by allowing dynamical systems to escape local energy minima. Electronic implementation of such stochastic networks is currently limited to the addition of algorithmic noise to digital machines, which is inherently inefficient, although recent efforts to harness physical noise in devices for stochasticity have shown promise. To succeed in fabricating electronic neuromorphic networks, we need experimental evidence of devices with measurable and controllable stochasticity, complemented by the development of reliable statistical models of such observed stochasticity. The current research literature has sparse evidence of the former and a complete lack of the latter. This motivates the current article, in which we demonstrate a stochastic neuron using an insulator-metal-transition (IMT) device, based on an electrically induced phase transition, in series with a tunable resistance. We show that an IMT neuron has dynamics similar to a piecewise linear FitzHugh-Nagumo (FHN) neuron and incorporates all characteristics of a spiking neuron in the device phenomena. We experimentally demonstrate spontaneous stochastic spiking along with electrically controllable firing probabilities using Vanadium Dioxide (VO2)-based IMT neurons, which show a sigmoid-like transfer function. The stochastic spiking is explained by two noise sources, thermal noise and threshold fluctuations, which act as precursors of bifurcation. As such, the IMT neuron is modeled as an Ornstein-Uhlenbeck (OU) process with a fluctuating boundary, resulting in transfer curves that closely match experiments. The moments of interspike intervals are calculated analytically by extending first-passage-time (FPT) models for the OU process to include a fluctuating boundary. We find that the coefficient of variation of interspike intervals depends on the relative proportion of thermal and threshold noise, where threshold noise is the dominant source in the current experimental demonstrations. As one of the first comprehensive studies of stochastic neuron hardware and its statistical properties, this article would enable efficient implementation of a large class of neuro-mimetic networks and algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
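The OU-process-with-fluctuating-boundary picture in the abstract above can be illustrated with a small Monte Carlo sketch. This is not the authors' calibrated device model: the function name, the Euler-Maruyama scheme, and every parameter value below are invented for illustration. The membrane potential follows an OU process, and the firing threshold is redrawn after each spike to mimic threshold noise; the coefficient of variation of the interspike intervals can then be read off directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def ou_isis_fluctuating_threshold(n_spikes=100, mu=0.8, tau=1.0, sigma=0.25,
                                  theta0=1.0, theta_jitter=0.1, dt=1e-3):
    """Interspike intervals of an OU membrane potential
    dX = (mu - X)/tau dt + sigma dW that fires and resets when X crosses
    a threshold; the threshold is redrawn after every spike,
    theta ~ N(theta0, theta_jitter^2), to mimic threshold fluctuations."""
    isis, x, t = [], 0.0, 0.0
    theta = theta0 + theta_jitter * rng.standard_normal()
    sq = sigma * np.sqrt(dt)
    while len(isis) < n_spikes:
        x += (mu - x) / tau * dt + sq * rng.standard_normal()
        t += dt
        if x >= theta:                 # spike: record ISI, reset, redraw threshold
            isis.append(t)
            x, t = 0.0, 0.0
            theta = theta0 + theta_jitter * rng.standard_normal()
    return np.array(isis)

isis = ou_isis_fluctuating_threshold()
cv = isis.std() / isis.mean()          # coefficient of variation of the ISIs
```

Varying `sigma` (thermal noise) against `theta_jitter` (threshold noise) lets one probe, in this toy setting, how the relative proportion of the two sources shapes the CV.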
7. Modulation of Context-Dependent Spatiotemporal Patterns within Packets of Spiking Activity.
- Author
- Itoh, Miho and Leleu, Timothée
- Subjects
- SPATIOTEMPORAL processes, NEURONS, COMPUTER simulation, MATERIAL plasticity, DYNAMICS
- Abstract
Recent experiments have shown that stereotypical spatiotemporal patterns occur during brief packets of spiking activity in the cortex, and it has been suggested that top-down inputs can modulate these patterns according to the context. We propose a simple model that may explain important features of these experimental observations and is analytically tractable. The key mechanism underlying this model is that context-dependent top-down inputs can modulate the effective connection strengths between neurons because of short-term synaptic depression. As a result, the degree of synchrony and, in turn, the spatiotemporal patterns of spiking activity that occur during packets are modulated by the top-down inputs. This is shown using an analytical framework, based on avalanche dynamics, that allows calculating the probability that a given neuron spikes during a packet, as well as with numerical simulations. Finally, we show that spatiotemporal patterns that replay previously experienced sequential stimuli, and their binding with the corresponding context, can be learned through spike-timing-dependent plasticity. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
8. The aesthetic experience as a characteristic feature of brain dynamics.
- Author
- Vitiello, Giuseppe
- Subjects
- AESTHETIC experience, AESTHETICS, ART theory, BRAIN research, NEURONS
- Abstract
An essay is presented which explores theoretical and experimental advancements in evolutionary and developmental biology, as well as micro-physics of brain dynamics. Overview of the features of the dissipative quantum model of the brain, which describes the collective neuronal activity and brain behavior in terms of microscopic dynamics, is provided. Discussion on the meaning of aesthetic experience and its relationship with the dissipative character of brain dynamics is offered.
- Published
- 2015
9. Analytical approximations of the firing rate of an adaptive exponential integrate-and-fire neuron in the presence of synaptic noise.
- Author
- Hertäg, Loreen, Durstewitz, Daniel, and Brunel, Nicolas
- Subjects
- NEUROSCIENCES, NEURONS, POSTSYNAPTIC potential, VOLTAGE spikes, NOISE
- Abstract
Computational models offer a unique tool for understanding the network-dynamical mechanisms which mediate between physiological and biophysical properties, and behavioral function. A traditional challenge in computational neuroscience is, however, that simple neuronal models which can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models which do reproduce such diversity are intractable analytically and computationally expensive. A number of intermediate models have been proposed whose aim is to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike rate adaptation (aEIF) which consists of two differential equations for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns, can be fit to physiological recordings quantitatively, and, once done so, is able to predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of aEIF in the presence of Gaussian synaptic noise, using two approaches. The first approach is based on the 2-dimensional Fokker-Planck equation that describes the (V,w)-probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, which is averaged over the distribution of the w variable. These analytically derived closed-form expressions were tested on simulations from a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rate of the simulated cells fed with in-vivo-like synaptic noise. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
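The aEIF model summarized in the abstract above lends itself to a short simulation sketch. This is a minimal Euler-Maruyama integration of the two aEIF equations with white gaussian current noise, not the paper's closed-form firing-rate expressions; the function name and all parameter values are dimensionless, invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def aeif_rate(I=1.5, sigma=0.5, T=200.0, dt=1e-3):
    """Simulate C dV/dt = -gL(V-EL) + gL*DT*exp((V-VT)/DT) - w + I + noise,
    tau_w dw/dt = a(V-EL) - w, with reset V->Vr, w->w+b at a spike.
    Returns the empirical firing rate (spikes per unit time)."""
    gL, EL, DT, VT = 1.0, 0.0, 0.1, 1.0   # leak and exponential spike-initiation terms
    a, b, tau_w = 0.1, 0.2, 5.0           # subthreshold and spike-triggered adaptation
    Vr, Vcut = 0.0, 2.0                   # reset value and numerical spike cutoff
    V, w, n_spikes = EL, 0.0, 0
    sq = sigma * np.sqrt(dt)
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) * dt
        V = V + dV + sq * rng.standard_normal()
        w += (a * (V - EL) - w) / tau_w * dt
        if V >= Vcut:                     # spike: reset V, increment adaptation
            V, w = Vr, w + b
            n_spikes += 1
    return n_spikes / T

rate = aeif_rate()
```

Sweeping `I` and `sigma` in such a simulation is how analytical rate formulas of the kind derived in the paper are typically validated.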
10. Genomic instantiation of consciousness in neurons through a biophoton field theory.
- Author
- Cacha, Lleuvelyn A. and Poznanski, Roman R.
- Subjects
- SELF-consciousness (Awareness), DNA, NEURONS, CONSCIOUSNESS, DYNAMIC pressure, STATIC pressure, SOLITONS
- Abstract
A theoretical framework is developed based on the premise that brains evolved into sufficiently complex adaptive systems capable of instantiating genomic consciousness through self-awareness and complex interactions that recognize qualitatively the controlling factors of biological processes. Furthermore, our hypothesis assumes that the collective interactions in neurons yield macroergic effects, which can produce sufficiently strong electric energy fields for electronic excitations to take place on the surface of endogenous structures via alpha-helical integral proteins as electro-solitons. Specifically the process of radiative relaxation of the electro-solitons allows for the transfer of energy via interactions with deoxyribonucleic acid (DNA) molecules to induce conformational changes in DNA molecules producing an ultra weak non-thermal spontaneous emission of coherent biophotons through a quantum effect. The instantiation of coherent biophotons confined in spaces of DNA molecules guides the biophoton field to be instantaneously conducted along the axonal and neuronal arbors and in-between neurons and throughout the cerebral cortex (cortico-thalamic system) and subcortical areas (e.g., midbrain and hindbrain). Thus providing an informational character of the electric coherence of the brain - referred to as quantum coherence. The biophoton field is realized as a conscious field upon the re-absorption of biophotons by exciplex states of DNA molecules. Such quantum phenomenon brings about self-awareness and enables objectivity to have access to subjectivity in the unconscious. As such, subjective experiences can be recalled to consciousness as subjective conscious experiences or qualia through co-operative interactions between exciplex states of DNA molecules and biophotons leading to metabolic activity and energy transfer across proteins as a result of protein-ligand binding during protein-protein communication. 
The biophoton field as a conscious field is attributable to the resultant effect of specifying qualia from the metabolic energy field that is transported in macromolecular proteins throughout specific networks of neurons that are constantly transforming into more stable associable representations as molecular solitons. The metastability of subjective experiences based on resonant dynamics occurs when bottom-up patterns of neocortical excitatory activity are matched with top-down expectations as adaptive dynamic pressures. These dynamics of on-going activity patterns influenced by the environment and selected as the preferred subjective experience in terms of a functional field through functional interactions and biological laws are realized as subjectivity and actualized through functional integration as qualia. It is concluded that interactionism and not information processing is the key in understanding how consciousness bridges the explanatory gap between subjective experiences and their neural correlates in the transcendental brain. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
11. Analytical Integrate-and-Fire Neuron Models with Conductance-Based Dynamics and Realistic Postsynaptic Potential Time Course for Event-Driven Simulation Strategies.
- Author
- Rudolph-Lilith, Michelle, Dubois, Mathieu, and Destexhe, Alain
- Subjects
- SIMULATION methods & models, NEURONS, DIFFERENTIAL equations, APPROXIMATION theory, EXPONENTIAL functions, DISCONTINUOUS functions, ARTIFICIAL neural networks
- Abstract
In a previous paper (Rudolph & Destexhe, 2006), we proposed various models, the gIF neuron models, of analytical integrate-and-fire (IF) neurons with conductance-based (COBA) dynamics for use in event-driven simulations. These models are based on an analytical approximation of the differential equation describing the IF neuron with exponential synaptic conductances and were successfully tested with respect to their response to random and oscillating inputs. Because they are analytical and mathematically simple, the gIF models are best suited for fast event-driven simulation strategies. However, the drawback of such models is that they rely on a nonrealistic postsynaptic potential (PSP) time course, consisting of a discontinuous jump followed by a decay governed by the membrane time constant. Here, we address this limitation by conceiving an analytical approximation of the COBA IF neuron model with the full PSP time course. The subthreshold and suprathreshold response of this gIF4 model reproduces remarkably well the postsynaptic responses of the numerically solved passive membrane equation subject to conductance noise, while gaining at least two orders of magnitude in computational performance. Although the analytical structure of the gIF4 model is more complex than that of its predecessors, owing to the necessity of calculating future spike times, a simple and fast algorithmic implementation for use in large-scale neural network simulations is proposed. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
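The event-driven strategy discussed in the abstract above, together with the jump-and-decay PSP it criticizes, can be sketched in a few lines. This is not the gIF4 model itself but the simpler picture it improves upon: between input events the potential is advanced analytically, so no fixed time step is needed, and each input produces a discontinuous jump. The function name and parameter values are illustrative assumptions.

```python
import math

def event_driven_lif(input_spike_times, tau=20.0, w=1.5, theta=5.0):
    """Event-driven LIF with a jump-and-decay PSP: between input events
    the potential decays analytically as V*exp(-dt/tau); each input adds
    an instantaneous jump w. Returns the output spike times (the
    potential resets to 0 after a spike)."""
    V, t_last, out = 0.0, 0.0, []
    for t in sorted(input_spike_times):
        V *= math.exp(-(t - t_last) / tau)   # exact subthreshold decay since last event
        V += w                               # discontinuous PSP jump
        t_last = t
        if V >= theta:
            out.append(t)
            V = 0.0
    return out

spikes = event_driven_lif([1, 2, 3, 4, 5])   # -> [4]
```

Because the state is updated only at events, the cost scales with the number of spikes rather than with simulated time, which is the appeal of event-driven strategies for large networks.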
12. Estimation of Time-Dependent Input from Neuronal Membrane Potential.
- Author
- Kobayashi, Ryota, Shinomoto, Shigeru, and Lansky, Petr
- Subjects
- NEURONS, PRESYNAPTIC receptors, STOCHASTIC processes, STATE-space methods, CELLULAR signal transduction, BIOLOGY experiments
- Abstract
The set of firing rates of the presynaptic excitatory and inhibitory neurons constitutes the input signal to the postsynaptic neuron. Estimation of the time-varying input rates from intracellularly recorded membrane potential is investigated here. For that purpose, the membrane potential dynamics must be specified. We consider the Ornstein-Uhlenbeck stochastic process, one of the most common single-neuron models, with time-dependent mean and variance. Assuming the slow variation of these two moments, it is possible to formulate the estimation problem by using a state-space model. We develop an algorithm that estimates the paths of the mean and variance of the input current by using the empirical Bayes approach. Then the input firing rates are directly available from the moments. The proposed method is applied to three simulated data examples: constant signal, sinusoidally modulated signal, and constant signal with a jump. For the constant signal, the estimation performance of the method is comparable to that of the traditionally applied maximum likelihood method. Further, the proposed method accurately estimates both continuous and discontinuous time-variable signals. In the case of the signal with a jump, which does not satisfy the assumption of slow variability, the robustness of the method is verified. It can be concluded that the method provides reliable estimates of the total input firing rates, which are not experimentally measurable. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
13. Estimating Parameters of Generalized Integrate-and-Fire Neurons from the Maximum Likelihood of Spike Trains.
- Author
- Dong, Yi, Mihalas, Stefan, Russell, Alexander, Etienne-Cummings, Ralph, and Niebur, Ernst
- Subjects
- NEUROSCIENCES, NEURONS, MAXIMUM likelihood statistics, MATHEMATICAL models, NONLINEAR functional analysis, BRAIN research, ESTIMATION theory
- Abstract
When a neuronal spike train is observed, what can we deduce from it about the properties of the neuron that generated it? A natural way to answer this question is to make an assumption about the type of neuron, select an appropriate model for this type, and then choose the model parameters as those that are most likely to generate the observed spike train. This is the maximum likelihood method. If the neuron obeys simple integrate-and-fire dynamics, Paninski, Pillow, and Simoncelli (2004) showed that its negative log-likelihood function is convex and that, at least in principle, its unique global minimum can thus be found by gradient descent techniques. Many biological neurons are, however, known to generate a richer repertoire of spiking behaviors than can be explained in a simple integrate-and-fire model. For instance, such a model retains only an implicit (through spike-induced currents), not an explicit, memory of its input; an example of a physiological situation that cannot be explained is the absence of firing if the input current is increased very slowly. Therefore, we use an expanded model (Mihalas & Niebur, 2009), which is capable of generating a large number of complex firing patterns while still being linear. Linearity is important because it maintains the distribution of the random variables and still allows maximum likelihood methods to be used. In this study, we show that although convexity of the negative log-likelihood function is not guaranteed for this model, the minimum of this function yields a good estimate for the model parameters, in particular if the noise level is treated as a free parameter. Furthermore, we show that a nonlinear function minimization method (r-algorithm with space dilation) usually reaches the global minimum. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
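The convexity point in the abstract above can be made concrete with a deliberately simple toy: a constant-rate (Poisson) neuron has i.i.d. exponential interspike intervals, and the negative log-likelihood of its rate parameter is convex, so a naive search finds the unique global minimum. This is a sketch only; it is not the Mihalas-Niebur model or the r-algorithm used in the paper, and the variable names and sample sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic spike train: i.i.d. exponential ISIs with rate true_rate.
true_rate = 4.0
isi = rng.exponential(1.0 / true_rate, size=500)

# Negative log-likelihood of the ISIs: -n*log(lam) + lam*sum(isi).
# It is convex in lam, so a simple grid (or gradient) search suffices.
lams = np.linspace(0.1, 10.0, 1000)
nll = -len(isi) * np.log(lams) + lams * isi.sum()
lam_hat = lams[np.argmin(nll)]   # sits near the analytic MLE n / sum(isi)
```

For richer models like the one in the paper, the negative log-likelihood need not be convex, which is why the authors verify empirically that its minimum still yields good parameter estimates.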
14. A LOWER BOUND FOR THE FIRST PASSAGE TIME DENSITY OF THE SUPRATHRESHOLD ORNSTEIN--UHLENBECK PROCESS.
- Author
- Thomas, Peter J.
- Subjects
- ORNSTEIN-Uhlenbeck process, SYNCHRONIZATION, MATHEMATICAL models, STOCHASTIC analysis, FORCING (Model theory), DISTRIBUTION (Probability theory), NEURONS
- Abstract
We prove that the first passage time density ρ(t) for an Ornstein-Uhlenbeck process X(t) obeying dX = -βX dt + σ dW to reach a fixed threshold θ from a suprathreshold initial condition x₀ > θ > 0 has a lower bound of the form ρ(t) > k exp[-p e^{6βt}] for positive constants k and p for times t exceeding some positive value u. We obtain explicit expressions for k, p, and u in terms of β, σ, x₀, and θ, and discuss the application of the results to the synchronization of periodically forced stochastic leaky integrate-and-fire model neurons. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
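The suprathreshold first-passage problem in the abstract above is easy to explore numerically: since the drift pulls paths toward 0, a path started at x₀ > θ crosses θ in roughly ln(x₀/θ)/β time units, broadened by the noise. The sketch below is a plain Euler-Maruyama Monte Carlo with invented parameter values, not the paper's analytical bound.

```python
import numpy as np

rng = np.random.default_rng(2)

def ou_fpt_samples(x0=2.0, theta=1.0, beta=1.0, sigma=0.5,
                   dt=1e-3, n=2000, t_max=50.0):
    """Monte Carlo first-passage times of dX = -beta*X dt + sigma*dW
    from the suprathreshold initial condition x0 down to theta
    (x0 > theta > 0). Paths are advanced in parallel and retired
    once they cross the threshold."""
    x = np.full(n, x0)
    fpt = np.full(n, np.nan)
    alive = np.ones(n, dtype=bool)
    for k in range(int(t_max / dt)):
        x[alive] += (-beta * x[alive] * dt
                     + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum()))
        hit = alive & (x <= theta)
        fpt[hit] = (k + 1) * dt
        alive &= ~hit
        if not alive.any():
            break
    return fpt

fpt = ou_fpt_samples()   # noiseless crossing time would be ln(2)/beta ~ 0.69
```

A histogram of `fpt` gives an empirical ρ(t) against which a lower bound of the form k·exp[-p e^{6βt}] can be compared in its tail.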
16. On a Stochastic Leaky Integrate-and-Fire Neuronal Model.
- Author
- Buonocore, A., Caputo, L., Pirozzi, E., and Ricciardi, L. M.
- Subjects
- NEURONS, STOCHASTIC approximation, GAUSSIAN distribution, PROBABILITY theory, VOLTERRA operators
- Abstract
The leaky integrate-and-fire neuronal model proposed in Stevens and Zador (1998), in which the time constant and resting potential are postulated to be time dependent, is revisited within a stochastic framework in which the membrane potential is mathematically described as a Gauss-diffusion process. The first-passage-time probability density, which in this context mimics the firing probability density, is evaluated by either the Volterra integral equation of Buonocore, Nobile, and Ricciardi (1987) or, when possible, by the asymptotics of Giorno, Nobile, and Ricciardi (1990). The model examined here represents an extension of the classic leaky integrate-and-fire model based on the Ornstein-Uhlenbeck process, in that it is in principle compatible with the inclusion of some other physiological characteristics such as relative refractoriness. It also allows finer tuning possibilities in view of its accounting for certain qualitative as well as quantitative features, such as the behavior of the time course of the membrane potential prior to firing and the computation of experimentally measurable statistical descriptors of the firing time: mean, median, coefficient of variation, and skewness. Finally, implementations of this model are provided in connection with certain experimental evidence discussed in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
16. Response of Integrate-and-Fire Neurons to Noisy Inputs Filtered by Synapses with Arbitrary Timescales: Firing Rate and Correlations.
- Author
- Moreno-Bote, Rubén and Parga, Néstor
- Subjects
- NEURONS, NOISE, NEURAL transmission, SENSORY receptors, SIGNAL processing, STOCHASTIC processes, SYNAPSES
- Abstract
Delivery of neurotransmitter at a synapse produces a current that flows through the membrane and into the soma of the neuron, where it is integrated. The decay time of the current depends on the synaptic receptor type and ranges from a few milliseconds (e.g., AMPA receptors) to a few hundred milliseconds (e.g., NMDA receptors). The role of this variety of synaptic timescales, several of them coexisting in the same neuron, is at present not understood. A prime question to answer is what effect temporal filtering of the incoming spike trains at different timescales has on the neuron's response. Here, based on our previous work on linear synaptic filtering, we build a general theory for the stationary firing response of integrate-and-fire (IF) neurons receiving stochastic inputs filtered by one, two, or multiple synaptic channels, each characterized by an arbitrary timescale. The formalism applies to arbitrary IF model neurons and arbitrary forms of input noise (i.e., not required to be gaussian or to have small amplitude), as well as to any form of synaptic filtering (linear or nonlinear). The theory determines with exact analytical expressions the firing rate of an IF neuron for long synaptic time constants using the adiabatic approach. The correlated spiking (cross-correlation function) of two neurons receiving common as well as independent sources of noise is also described. The theory is illustrated using leaky, quadratic, and noise-thresholded IF neurons. Although the adiabatic approach is exact when at least one of the synaptic timescales is long, it provides a good prediction of the firing rate even when the timescales of the synapses are comparable to that of the leak of the neuron; it is not required that the synaptic time constants be longer than the mean interspike interval or that the noise have small variance. The distribution of the potential for general IF neurons is also characterized. 
Our results provide powerful analytical tools that can allow a quantitative description of the dynamics of neuronal networks with realistic synaptic dynamics. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
17. Theory of Input Spike Auto- and Cross-Correlations and Their Effect on the Response of Spiking Neurons.
- Author
- Moreno-Bote, Rubén, Renart, Alfonso, and Parga, Néstor
- Subjects
- NEURONS, NERVOUS system, PAIRING correlations (Nuclear physics), NEUROSCIENCES, FOKKER-Planck equation
- Abstract
Spike correlations between neurons are ubiquitous in the cortex, but their role is not understood. Here we describe the firing response of a leaky integrate-and-fire (LIF) neuron when it receives a temporally correlated input generated by presynaptic correlated neuronal populations. Input correlations are characterized in terms of the firing rates, Fano factors, correlation coefficients, and correlation timescale of the neurons driving the target neuron. We show that the sum of the presynaptic spike trains cannot be well described by a Poisson process. In fact, the total input current has a nontrivial two-point correlation function described by two main parameters: the correlation timescale (how precise the input correlations are in time) and the correlation magnitude (how strong they are). Therefore, the total current generated by the input spike trains is not well described by a white noise gaussian process. Instead, we model the total current as a colored gaussian process with the same mean and two-point correlation function, leading to the formulation of the problem in terms of a Fokker-Planck equation. Solutions of the output firing rate are found in the limits of short and long correlation timescales. The solutions described here expand and improve on our previous results (Moreno, de la Rocha, Renart, & Parga, 2002) by presenting new analytical expressions for the output firing rate for general IF neurons, extending the validity of the results to arbitrarily large correlation magnitude, and describing the differential effect of correlations on the mean-driven and noise-dominated firing regimes. The details of this novel formalism are also given here for the first time. We employ numerical simulations to confirm the analytical solutions and study the firing response to sudden changes in the input correlations. 
We expect this formalism to be useful for the study of correlations in neuronal networks and their role in neural processing and information transmission. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
18. Correlation between neural spike trains increases with firing rate.
- Author
- de la Rocha, Jaime, Doiron, Brent, Shea-Brown, Eric, Josić, Krešimir, and Reyes, Alex
- Subjects
- STATISTICAL correlation, NEURONS, THALAMUS, SOMATOSENSORY evoked potentials, HETEROGENEITY, SYNAPSES, LINEAR statistical models, BIOLOGICAL neural networks
- Abstract
Populations of neurons in the retina, olfactory system, visual and somatosensory thalamus, and several cortical regions show temporal correlation between the discharge times of their action potentials (spike trains). Correlated firing has been linked to stimulus encoding, attention, stimulus discrimination, and motor behaviour. Nevertheless, the mechanisms underlying correlated spiking are poorly understood, and its coding implications are still debated. It is not clear, for instance, whether correlations between the discharges of two neurons are determined solely by the correlation between their afferent currents, or whether they also depend on the mean and variance of the input. We addressed this question by computing the spike train correlation coefficient of unconnected pairs of in vitro cortical neurons receiving correlated inputs. Notably, even when the input correlation remained fixed, the spike train output correlation increased with the firing rate, but was largely independent of spike train variability. With a combination of analytical techniques and numerical simulations using ‘integrate-and-fire’ neuron models we show that this relationship between output correlation and firing rate is robust to input heterogeneities. Finally, this overlooked relationship is replicated by a standard threshold-linear model, demonstrating the universality of the result. This connection between the rate and correlation of spiking activity links two fundamental features of the neural code. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
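The central experiment in the abstract above, unconnected model neurons receiving partially shared input, can be reproduced in miniature. The sketch below drives two LIF neurons with white noise that has a common component of fixed correlation `c`, then measures the Pearson correlation of their spike counts in time bins. It is an illustrative toy, not the authors' in vitro protocol; the function name, parameters, and seed are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def pair_stats(mu, c=0.5, sigma=0.7, theta=1.0, dt=1e-3, T=200.0, bin_steps=200):
    """Two unconnected LIF neurons, dV = (mu - V) dt + sigma dW, driven
    by white noise with a shared component (input correlation c).
    Returns the pair's mean firing rate and the Pearson correlation of
    their spike counts in bins of bin_steps * dt time units."""
    n = (int(T / dt) // bin_steps) * bin_steps   # whole number of bins
    shared = rng.standard_normal(n)
    sq = sigma * np.sqrt(dt)
    counts = []
    for _ in range(2):
        noise = np.sqrt(c) * shared + np.sqrt(1 - c) * rng.standard_normal(n)
        V, train = 0.0, np.zeros(n)
        for k in range(n):
            V += (mu - V) * dt + sq * noise[k]
            if V >= theta:               # threshold crossing: spike and reset
                train[k], V = 1.0, 0.0
        counts.append(train.reshape(-1, bin_steps).sum(axis=1))
    rate = (counts[0].sum() + counts[1].sum()) / (2 * T)
    rho = np.corrcoef(counts[0], counts[1])[0, 1]
    return rate, rho

rate_lo, rho_lo = pair_stats(mu=0.6)   # fluctuation-driven, lower rate
rate_hi, rho_hi = pair_stats(mu=1.5)   # mean-driven, higher rate
```

In line with the paper's result, one should find the output correlation `rho_hi` exceeding `rho_lo` even though the input correlation `c` is held fixed, although a single short run like this is statistically noisy.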
19. Mean-Driven and Fluctuation-Driven Persistent Activity in Recurrent Networks.
- Author
- Renart, Alfonso, Moreno-Bote, Rubén, Wang, Xiao-Jing, and Parga, Néstor
- Subjects
- NEURONS, SHORT-term memory, POISSON processes, MEAN field theory, NEURAL circuitry, BIOLOGICAL neural networks
- Abstract
Spike trains from cortical neurons show a high degree of irregularity, with coefficients of variation (CV) of their interspike interval (ISI) distribution close to or higher than one. It has been suggested that this irregularity might be a reflection of a particular dynamical state of the local cortical circuit in which excitation and inhibition balance each other. In this "balanced" state, the mean current to the neurons is below threshold, and firing is driven by current fluctuations, resulting in irregular Poisson-like spike trains. Recent data show that the degree of irregularity in neuronal spike trains recorded during the delay period of working memory experiments is the same for both low-activity states of a few Hz and for elevated, persistent activity states of a few tens of Hz. Since the difference between these persistent activity states cannot be due to external factors coming from sensory inputs, this suggests that the underlying network dynamics might support coexisting balanced states at different firing rates. We use mean field techniques to study the possible existence of multiple balanced steady states in recurrent networks of current-based leaky integrate-and-fire (LIF) neurons. To assess the degree of balance of a steady state, we extend existing mean-field theories so that not only the firing rate, but also the coefficient of variation of the interspike interval distribution of the neurons, are determined self-consistently. Depending on the connectivity parameters of the network, we find bistable solutions of different types. If the local recurrent connectivity is mainly excitatory, the two stable steady states differ mainly in the mean current to the neurons. In this case, the mean drive in the elevated persistent activity state is suprathreshold and typically characterized by low spiking irregularity. 
If the local recurrent excitatory and inhibitory drives are both large and nearly balanced, or even dominated by inhibition, two stable states coexist, both with subthreshold current drive. In this case, the spiking variability in both the resting state and the mnemonic persistent state is large, but the balance condition implies parameter fine-tuning. Since the degree of required fine-tuning increases with network size and, on the other hand, the size of the fluctuations in the afferent current to the cells increases for small networks, overall we find that fluctuation-driven persistent activity in the very simplified type of models we analyze is not a robust phenomenon. Possible implications of considering more realistic models are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
20. The Effect of NMDA Receptors on Gain Modulation.
- Author
- Berends, Michiel, Maex, Reinoud, and de Schutter, Erik
- Subjects
- NEURONS, METHYL aspartate, ASPARTIC acid, EXCITATORY amino acids, NEURAL circuitry, NEURAL transmission, NERVOUS system
- Abstract
The ability of individual neurons to modulate the gain of their input-output function is important for information processing in the brain. In a recent study (Mitchell & Silver, 2003), shunting inhibition was found to modulate the gain of cerebellar granule cells subjected to simulated currents through AMPA receptor synapses. Here we investigate the effect on gain modulation resulting from adding the currents mediated by NMDA receptors to a compartmental model of the granule cell. With only AMPA receptors, the changes in gain induced by shunting inhibition decreased gradually with the average firing rate of the afferent mossy fibers. With NMDA receptors present, this decrease was more rapid, therefore narrowing the bandwidth of mossy fiber firing rates available for gain modulation. The deterioration of gain modulation was accompanied by a reduced variability of the input current and saturation of NMDA receptors. However, when the output of the granule cell was plotted as a function of the average input current instead of the input firing frequency, both models showed very similar response curves and comparable gain modulation. We conclude that NMDA receptors do not directly impair gain control by shunting inhibition, but the effective bandwidth decreases as a consequence of the increased total charge transfer. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
21. Mean Instantaneous Firing Frequency Is Always Higher Than the Firing Rate.
- Author
-
Lánský, Petr, Rodriguez, Roger, and Sacerdote, Laura
- Subjects
NEURONS ,TRANSFER functions ,NEURAL transmission ,CONTROL (Psychology) ,NERVOUS system ,NEUROPHYSIOLOGY - Abstract
Frequency coding is considered one of the most common coding strategies employed by neural systems. This fact leads, in experiments as well as in theoretical studies, to the construction of so-called transfer functions, where the output firing frequency is plotted against the input intensity. The term firing frequency can be understood differently in different contexts. Basically, it means that the number of spikes over an interval of preselected length is counted and then divided by the length of the interval, but for obvious practical reasons, the observation period cannot be arbitrarily long. Alternatively, the firing frequency is defined as the reciprocal of the mean interspike interval. In parallel, an instantaneous firing frequency can be defined as the reciprocal of the length of the current interspike interval, and by taking the mean of these, the definition can be extended to the mean instantaneous firing frequency. All of these definitions of firing frequency are compared in an effort to contribute to a better understanding of the input-output properties of a neuron. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
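The ordering asserted in the title follows from Jensen's inequality, since 1/x is convex. A minimal numerical check, assuming gamma-distributed interspike intervals (the shape and scale below are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interspike intervals (seconds); shape/scale are assumptions,
# chosen so the mean ISI is 0.1 s, i.e. a firing rate of 10 Hz.
isi = rng.gamma(shape=3.0, scale=1.0 / 30.0, size=100_000)

firing_rate = 1.0 / isi.mean()        # reciprocal of the mean ISI
mean_inst_freq = (1.0 / isi).mean()   # mean of the reciprocal ISIs

# Jensen's inequality (1/x is convex) guarantees mean_inst_freq >= firing_rate.
print(firing_rate, mean_inst_freq)
```

For this gamma(3, 1/30) choice the gap is large: the firing rate is 10 Hz while the analytic mean instantaneous frequency is 1/(scale·(shape-1)) = 15 Hz.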
22. Dynamics of Deterministic and Stochastic Paired Excitatory–Inhibitory Delayed Feedback.
- Author
-
Laing, Carlo R. and Longtin, André
- Subjects
NEURONS ,BIFURCATION theory ,NUMERICAL analysis ,OSCILLATIONS ,NERVOUS system - Abstract
We examine the effects of paired delayed excitatory and inhibitory feedback on a single integrate-and-fire neuron with reversal potentials embedded within a feedback network. These effects are studied using bifurcation theory and numerical analysis. The feedback occurs through modulation of the excitatory and inhibitory conductances by the previous firing history of the neuron; as a consequence, the feedback also modifies the membrane time constant. Such paired feedback is ubiquitous in the nervous system. We assume that the feedback dynamics are slower than the membrane time constant, which leads to a rate model formulation. Our article provides an extensive analysis of the possible dynamical behaviors of such simple yet realistic neural loops as a function of the balance between positive and negative feedback, with and without noise, and offers insight into the potential behaviors such loops can exhibit in response to time-varying external inputs. With excitatory feedback, the system can be quiescent, can be periodically firing, or can exhibit bistability between these two states. With inhibitory feedback, quiescence, oscillatory firing rates, and bistability between constant and oscillatory firing-rate solutions are possible. The general case of paired feedback exhibits a blend of the behaviors seen in the extreme cases and can produce chaotic firing. We further derive a condition for a dynamically balanced paired feedback in which there is neither bistability nor oscillations. We also show how a biophysically plausible smoothing of the firing function by noise can modify the existence and stability of fixed points and oscillations of the system. We take advantage in our mathematical analysis of the existence of an invariant manifold, which reduces the dimensionality of the dynamics, and prove the stability of this manifold. The novel computational challenges involved in analyzing such dynamics with and without noise are also described.
Our results demonstrate that a paired delayed feedback loop can act as a sophisticated computational unit, capable of switching between a variety of behaviors depending on the input current, the relative strengths and asymmetry of the two parallel feedback pathways, and the delay distributions and noise level. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
23. Characterization of Subthreshold Voltage Fluctuations in Neuronal Membranes.
- Author
-
Rudolph, M. and Destexhe, A.
- Subjects
NEURONS ,FOKKER-Planck equation ,DENSITY functionals - Abstract
Synaptic noise due to intense network activity can have a significant impact on the electrophysiological properties of individual neurons. This is the case for the cerebral cortex, where ongoing activity leads to strong barrages of synaptic inputs, which act as the main source of synaptic noise affecting neuronal dynamics. Here, we characterize the subthreshold behavior of neuronal models in which synaptic noise is represented by either additive or multiplicative noise, described by Ornstein-Uhlenbeck processes. We derive and solve the Fokker-Planck equation for this system, which describes the time evolution of the probability density function for the membrane potential. We obtain an analytic expression for the membrane potential distribution at steady state and compare this expression with the subthreshold activity obtained in Hodgkin-Huxley-type models with stochastic synaptic inputs. The differences between multiplicative and additive noise models suggest that multiplicative noise is adequate to describe the high-conductance states similar to in vivo conditions. Because the steady-state membrane potential distribution is easily obtained experimentally, this approach provides a possible method to estimate the mean and variance of synaptic conductances in real neurons. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
24. Ergodicity of Spike Trains: When Does Trial Averaging Make Sense?
- Author
-
Masuda, Naoki and Aihara, Kazuyuki
- Subjects
NEURONS ,NEUROTRANSMITTERS - Abstract
Neuronal information processing is often studied on the basis of spiking patterns. The relevant statistics, such as firing rates calculated with the peri-stimulus time histogram, are obtained by averaging spiking patterns over many experimental runs. However, animals must respond to a single stimulation in real situations, and what is available to the brain is not the trial statistics but the population statistics. Consequently, physiological ergodicity, namely, the consistency between trial averaging and population averaging, is implicitly assumed in the data analyses, although it does not trivially hold true. In this letter, we investigate how characteristics of noisy neural network models, such as single neuron properties, external stimuli, and synaptic inputs, affect the statistics of firing patterns. In particular, we show how high membrane potential sensitivity to input fluctuations, the inability of neurons to remember past inputs, external stimuli with large variability and temporally separated peaks, and relatively small contributions of synaptic inputs result in spike trains that are reproducible over many trials. The reproducibility of spike trains and synchronous firing are contrasted and related to the ergodicity issue. Several numerical calculations with neural network examples are carried out to support the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
25. Interspike Interval Correlations, Memory, Adaptation, and Refractoriness in a Leaky Integrate-and-Fire Model with Threshold Fatigue.
- Author
-
Chacron, Maurice J., Pakdaman, Khashayar, and Longtin, André
- Subjects
NEUROPLASTICITY ,NEURONS ,MEMORY - Abstract
Neuronal adaptation as well as interdischarge interval correlations have been shown to be functionally important properties of physiological neurons. We explore the dynamics of a modified leaky integrate-and-fire (LIF) neuron, referred to as the LIF with threshold fatigue, and show that it reproduces these properties. In this model, the postdischarge threshold reset depends on the preceding sequence of discharge times. We show that in response to various classes of stimuli, namely, constant currents, step currents, white gaussian noise, and sinusoidal currents, the model exhibits new behavior compared with the standard LIF neuron. More precisely, (1) step currents lead to adaptation, that is, a progressive decrease of the discharge rate following the stimulus onset, while in the standard LIF, no such patterns are possible; (2) a saturation in the firing rate occurs in certain regimes, a behavior not seen in the LIF neuron; (3) interspike intervals of the noise-driven modified LIF under constant current are correlated in a way reminiscent of experimental observations, while those of the standard LIF are independent of one another; (4) the magnitude of the correlation coefficients decreases as a function of noise intensity; and (5) the dynamics of the sinusoidally forced modified LIF are described by iterates of an annulus map, an extension to the circle map dynamics displayed by the LIF model. Under certain conditions, this map can give rise to sensitivity to initial conditions and thus chaotic behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
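The threshold-fatigue mechanism can be sketched numerically. This is not the authors' exact model; every parameter below is assumed for illustration. After each spike the firing threshold jumps up and then relaxes back, so a step current produces interspike intervals that lengthen over time, i.e., adaptation:

```python
import numpy as np

# Assumed parameters for an LIF with threshold fatigue.
tau_m, tau_theta = 0.01, 0.2   # membrane and threshold time constants (s)
theta0, d_theta = 1.0, 0.3     # resting threshold and post-spike increment
v_reset, I = 0.0, 1.5          # reset potential and step-current drive
dt, T = 1e-4, 2.0

v, theta = 0.0, theta0
spikes = []
for step in range(int(T / dt)):
    t = step * dt
    v += dt * (I - v) / tau_m                   # leaky integration of the step input
    theta += dt * (theta0 - theta) / tau_theta  # threshold relaxes toward theta0
    if v >= theta:
        spikes.append(t)
        v = v_reset
        theta += d_theta                        # fatigue: threshold jumps after a spike

spikes = np.array(spikes)
early = np.diff(spikes[spikes < 0.2])   # intervals just after stimulus onset
late = np.diff(spikes[spikes > 1.5])    # intervals once the rate has adapted
print(early.mean(), late.mean())        # the later intervals are longer
```

A standard LIF (d_theta = 0) would fire at a constant rate under the same step current; the threshold increment is what produces the progressive rate decrease described in the abstract.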
26. Integrate-and-Fire Neurons Driven by Correlated Stochastic Input.
- Author
-
Salinas, Emilio and Sejnowski, Terrence J.
- Subjects
NEURONS ,NEURAL transmission ,STOCHASTIC analysis - Abstract
Neurons are sensitive to correlations among synaptic inputs. However, analytical models that explicitly include correlations are hard to solve analytically, so their influence on a neuron's response has been difficult to ascertain. To gain some intuition on this problem, we studied the firing times of two simple integrate-and-fire model neurons driven by a correlated binary variable that represents the total input current. Analytic expressions were obtained for the average firing rate and coefficient of variation (a measure of spike-train variability) as functions of the mean, variance, and correlation time of the stochastic input. The results of computer simulations were in excellent agreement with these expressions. In these models, an increase in correlation time in general produces an increase in both the average firing rate and the variability of the output spike trains. However, the magnitude of the changes depends differentially on the relative values of the input mean and variance: the increase in firing rate is higher when the variance is large relative to the mean, whereas the increase in variability is higher when the variance is relatively small. In addition, the firing rate always tends to a finite limit value as the correlation time increases toward infinity, whereas the coefficient of variation typically diverges. These results suggest that temporal correlations may play a major role in determining the variability as well as the intensity of neuronal spike trains. [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
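The qualitative effect described in the abstract, that longer input correlation times increase output variability, can be reproduced with a toy version of the setup: a perfect (nonleaky) integrator driven by a binary telegraph input. This is a simplification of the authors' models, and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def isi_cv(tau_c, mean=1.2, amp=1.0, theta=20.0, dt=1e-2, T=4000.0):
    """CV of interspike intervals for a perfect integrator driven by a
    binary input switching between mean-amp and mean+amp, with
    correlation time tau_c (illustrative parameters)."""
    p_flip = dt / (2.0 * tau_c)           # per-step switching probability
    flips = rng.random(int(T / dt)) < p_flip
    state, v, last_t = 1.0, 0.0, 0.0
    isis = []
    for i, flip in enumerate(flips):
        if flip:
            state = -state
        v += (mean + amp * state) * dt    # nonleaky integration of the input
        if v >= theta:
            t = (i + 1) * dt
            isis.append(t - last_t)
            last_t = t
            v = 0.0
    isis = np.asarray(isis)
    return isis.std() / isis.mean()

cv_short = isi_cv(tau_c=0.05)   # fast fluctuations average out within one ISI
cv_long = isi_cv(tau_c=5.0)     # slow fluctuations carry through to the output
print(cv_short, cv_long)
```

With a short correlation time the input averages out over each interspike interval and the output is nearly regular; with a correlation time comparable to the mean interval, the variability is an order of magnitude larger.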
27. Impact of Geometrical Structures on the Output of Neuronal Models: A Theoretical and Numerical Analysis.
- Author
-
Feng, Jianfeng and Li, Guibin
- Subjects
NEURONS ,GEOMETRY ,NUMERICAL analysis - Abstract
What is the difference between the efferent spike train of a neuron with a large soma versus that of a neuron with a small soma? We propose an analytical method called the decoupling approach to tackle the problem. Two limiting cases, the soma being much smaller than the dendrite or vice versa, are theoretically investigated. For both the two-compartment integrate-and-fire model and the Pinsky-Rinzel model, we show, both theoretically and numerically, that the smaller the soma is, the faster and the more irregularly the neuron fires. We further conclude, in terms of numerical simulations, that cells falling in between the two limiting cases form a continuum with respect to their firing properties (mean firing time and coefficient of variation of inter-spike intervals). [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
28. Period Focusing Induced by Network Feedback in Populations of Noisy Integrate-and-Fire Neurons.
- Author
-
Rodríguez, Francisco B., Suárez, Alberto, and López, Vicente
- Subjects
NEURONS ,STOCHASTIC processes ,GEOMETRIC modeling - Abstract
The population dynamics of an ensemble of nonleaky integrate-and-fire stochastic neurons is studied. The model selected allows for a detailed analysis of situations where noise plays a dominant role. Simulations in a regime with weak to moderate interactions show that a mechanism of excitatory message interchange among the neurons leads to a decrease in the firing period dispersion of the individual units. The dispersion reduction observed is larger than what would be expected from the decrease in the period. This "period focusing" is explained using a mean-field model. It is a dynamical effect that arises from the progressive decrease of the effective firing threshold as a result of the messages received by each unit from the rest of the population. A back-of-the-envelope formula to calculate this nontrivial dispersion reduction and a simple geometrical description of the effect are also provided. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
29. Impact of Correlated Inputs on the Output of the Integrate-and-Fire Model.
- Author
-
Brown, David and Feng, Jianfeng
- Subjects
STATISTICAL correlation ,CELL communication ,NEURONS - Abstract
For the integrate-and-fire model with or without reversal potentials, we consider how correlated inputs affect the variability of cellular output. For both models, the variability of efferent spike trains measured by the coefficient of variation (CV) of the interspike interval is a nondecreasing function of input correlation. When the correlation coefficient is greater than 0.09, the CV of the integrate-and-fire model without reversal potentials is always above 0.5, no matter how strong the inhibitory inputs. When the correlation coefficient is greater than 0.05, the CV for the integrate-and-fire model with reversal potentials is always above 0.5, independent of the strength of the inhibitory inputs. Under a given condition on correlation coefficients, we find that correlated Poisson processes can be decomposed into independent Poisson processes. We also develop a novel method to estimate the distribution density of the first passage time of the integrate-and-fire model. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
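The decomposition of correlated Poisson processes into independent ones mentioned in the abstract can be illustrated with the standard shared-component construction, in which each of two correlated counts is the sum of a private and a common independent Poisson variable (the rates below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two correlated Poisson spike counts built from three independent components.
lam_private, lam_shared = 8.0, 2.0
n = 200_000

a = rng.poisson(lam_private, n)
b = rng.poisson(lam_private, n)
c = rng.poisson(lam_shared, n)   # component shared by both trains

x, y = a + c, b + c              # each is Poisson(lam_private + lam_shared)

rho_emp = np.corrcoef(x, y)[0, 1]
rho_theory = lam_shared / (lam_private + lam_shared)
print(rho_emp, rho_theory)       # correlation is set by the shared rate
```

For equal total rates, the correlation coefficient equals the shared fraction of the rate, here 2/10 = 0.2, and the empirical estimate matches it closely.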
30. Event-driven contrastive divergence: neural sampling foundations.
- Author
-
Neftci, Emre, Das, Srinjoy, Pedroni, Bruno, Kreutz-Delgado, Kenneth, and Cauwenberghs, Gert
- Subjects
NEURONS ,MARKOV chain Monte Carlo ,STOCHASTIC analysis ,BOLTZMANN machine ,ARTIFICIAL neural networks - Abstract
The authors discuss the use of the contrastive divergence (CD) framework in neural sampling. Topics discussed include the capability of neuron models to perform Boltzmann distribution sampling using the Markov chain Monte Carlo (MCMC) method, the stochastic nature of neural sampling, and the use of the Boltzmann machine in evaluating integrate-and-fire (IF) neural networks.
- Published
- 2015
- Full Text
- View/download PDF
31. The Ornstein-Uhlenbeck Process Does Not Reproduce Spiking Statistics of Neurons in Prefrontal Cortex.
- Author
-
Shinomoto, Shigeru, Sakai, Yutaka, and Funahashi, Shintaro
- Subjects
NEURONS ,PREFRONTAL cortex - Abstract
Cortical neurons of behaving animals generate irregular spike sequences. Recently, there has been a heated discussion about the origin of this irregularity. Softky and Koch (1993) pointed out the inability of standard single-neuron models to reproduce the irregularity of the observed spike sequences when the model parameters are chosen within a certain range that they consider to be plausible. Shadlen and Newsome (1994), on the other hand, demonstrated that a standard leaky integrate-and-fire model can reproduce the irregularity if the inhibition is balanced with the excitation. Motivated by this discussion, we attempted to determine whether the Ornstein-Uhlenbeck process, which is naturally derived from the leaky integration assumption, can in fact reproduce higher-order statistics of biological data. For this purpose, we consider actual neuronal spike sequences recorded from the monkey prefrontal cortex to calculate the higher-order statistics of the interspike intervals. Consistency of the data with the model is examined on the basis of the coefficient of variation and the skewness coefficient, which are, respectively, a measure of the spiking irregularity and a measure of the asymmetry of the interval distribution. It is found that the biological data are not consistent with the model if the model time constant assumes a value within a certain range believed to cover all reasonable values. This fact suggests that the leaky integrate-and-fire model with the assumption of uncorrelated inputs is not adequate to account for the spiking in at least some cortical neurons. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
32. Fast Temporal Encoding and Decoding with Spiking Neurons.
- Author
-
Horn, David and Levanda, Sharon
- Subjects
NEURONS ,HUMAN information processing - Abstract
We propose a simple theoretical structure of interacting integrate-and-fire neurons that can handle fast information processing and may account for the fact that only a few neuronal spikes suffice to transmit information in the brain. Using integrate-and-fire neurons that are subjected to individual noise and to a common external input, we calculate their first passage time (FPT), or interspike interval. We suggest using a population average for evaluating the FPT that represents the desired information. Instantaneous lateral excitation among these neurons helps the analysis. By employing a second layer of neurons with variable connections to the first layer, we represent the strength of the input by the number of output neurons that fire, thus decoding the temporal information. Such a model can easily lead to a logarithmic relation as in Weber's law. The latter follows naturally from information maximization if the input strength is statistically distributed according to an approximate inverse law. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
33. Noise adaptation in integrate-and-fire neurons.
- Author
-
Rudd, Michael E. and Brown, Lawrence G.
- Subjects
NEURONS - Abstract
Analyzes the statistical spiking behavior of stochastic integrate-and-fire neurons. Effect of reset mechanism on dynamics of average neural spiking rate; Suppression of neuron with membrane leak in reset mechanism; Adaptation of negative domain in noise input; Reflection of barriers in alternative boundary conditions; Description of generator potential distributions of density functions.
- Published
- 1997
- Full Text
- View/download PDF
34. Spiking mechanisms of cortical neurons.
- Author
-
Shinomoto, Shigeru and Sakai, Yutaka
- Subjects
NEURONS ,CEREBRAL cortex - Abstract
Spike sequences of in-vivo cortical neurons are highly irregular. Recently, Softky and Koch addressed the question of the suitability of the integrate-and-fire model for the spiking mechanism in the light of spiking irregularity. We consider real neuronal spike sequences and examine whether this model is really unsatisfactory for explaining actual data. It is found that the leaky integrate-and-fire model is not completely inconsistent with the real data. Detailed analysis, however, shows a need for modifications of the model. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
35. Realistic neurons can compute the operations needed by quantum probability theory and other vector symbolic architectures.
- Author
-
Stewart, Terrence C. and Eliasmith, Chris
- Subjects
NEURONS ,QUANTUM theory ,PROBABILITY theory ,INFORMATION processing ,COGNITION ,ALGEBRAIC functions - Abstract
Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B). [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
36. Spiking Neuron Models : Single Neurons, Populations, Plasticity
- Author
-
Wulfram Gerstner, Werner M. Kistler, Wulfram Gerstner, and Werner M. Kistler
- Subjects
- Neural circuitry, Neural networks (Neurobiology), Neurons, Computational neuroscience, Neuroplasticity
- Abstract
Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
- Published
- 2002