Impact of dendritic non-linearities on the computational capabilities of neurons

Authors:
Lauditi, Clarissa
Malatesta, Enrico M.
Pittorino, Fabrizio
Baldassi, Carlo
Brunel, Nicolas
Zecchina, Riccardo
Publication Year: 2024

Abstract

Multiple neurophysiological experiments have shown that dendritic non-linearities can strongly influence synaptic input integration. In this work we model a single neuron as a two-layer computational unit with non-overlapping sign-constrained synaptic weights and a biologically plausible form of dendritic non-linearity, which is analytically tractable using statistical physics methods. Using both analytical and numerical tools, we demonstrate several key computational advantages of non-linear dendritic integration compared to models with linear synaptic integration. We find that the dendritic non-linearity simultaneously increases the number of input-output associations that can be learned and the learning speed, and we characterize how capacity and learning speed depend on the implemented non-linearity and on the levels of dendritic and somatic inhibition. We find that experimentally observed connection probabilities naturally emerge in neurons with sign-constrained synapses as a consequence of non-linear dendritic integration, whereas in models with linear integration an additional robustness parameter must be introduced to reproduce realistic connection probabilities. This non-linearly induced sparsity brings a second central advantage for neuronal information processing, namely robustness to input and synaptic noise. By testing our model on standard real-world benchmark datasets inspired by deep learning practice, we observe empirically that the non-linearity improves generalization performance, showing that it enables the neuron to capture more complex input-output relations.

Comment: 35 pages, 11 figures
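As a rough illustration of the model class described in the abstract, the sketch below (Python/NumPy) implements a two-layer unit in which the inputs are partitioned into non-overlapping dendritic branches with sign-constrained (here non-negative, excitatory) synaptic weights, each branch applies a non-linearity to its local sum, and the soma sums the branch outputs and thresholds the result. The specific non-linearity (a saturating rectifier), the branch count, and the scalar dendritic and somatic inhibition terms are illustrative assumptions, not the paper's exact parametrization.

    import numpy as np

    def dendritic_neuron_output(x, W, g, theta_dend=0.0, theta_soma=0.0):
        """
        Two-layer dendritic-neuron sketch (illustrative assumptions only).

        x : (n_branches, k) inputs partitioned into non-overlapping branches
        W : (n_branches, k) sign-constrained synaptic weights (here >= 0)
        g : branch non-linearity applied to each dendritic pre-activation
        theta_dend : assumed scalar dendritic inhibition / threshold
        theta_soma : assumed scalar somatic inhibition / threshold
        """
        assert np.all(W >= 0), "excitatory sign constraint on synaptic weights"
        # Each branch linearly integrates its own synapses, then applies g.
        branch_drive = np.sum(W * x, axis=1) - theta_dend
        branch_out = g(branch_drive)
        # The soma sums branch outputs and thresholds to give a binary output.
        return np.sign(np.sum(branch_out) - theta_soma)

    def g(h):
        # Assumed saturating rectifier as the dendritic non-linearity.
        return np.clip(h, 0.0, 1.0)

    rng = np.random.default_rng(0)
    n_branches, k = 10, 40
    x = rng.choice([-1.0, 1.0], size=(n_branches, k))  # binary inputs
    W = rng.random((n_branches, k))                     # non-negative weights
    print(dendritic_neuron_output(x, W, g, theta_dend=1.0, theta_soma=2.0))

With linear integration, g would be the identity and the unit reduces to a sign-constrained perceptron; the claimed capacity, learning-speed, sparsity, and noise-robustness advantages arise from the non-linear branch stage.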

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2407.07572
Document Type: Working Paper