
Improving Performance in Neural Networks by Dendrites-Activated Connections

Authors :
Metta, Carlo
Fantozzi, Marco
Papini, Andrea
Amato, Gianluca
Bergamaschi, Matteo
Galfrè, Silvia Giulia
Marchetti, Alessandro
Vegliò, Michelangelo
Parton, Maurizio
Morandin, Francesco
Publication Year :
2023

Abstract

Computational units in artificial neural networks compute a linear combination of their inputs and then apply a nonlinear filter, often a ReLU shifted by some bias. If the inputs come from other units, they have already been filtered with those units' own biases. Within a layer, multiple units share the same inputs, yet each input was filtered with a single bias, so output values rest on shared input biases rather than individually optimal ones. To mitigate this issue, we introduce DAC, a new computational unit based on preactivation and multiple biases, where input signals undergo independent nonlinear filtering before the linear combination. We provide a Keras implementation and report its computational efficiency. We test DAC convolutions in ResNet architectures on CIFAR-10, CIFAR-100, Imagenette, and Imagewoof, and achieve performance improvements of up to 1.73%. We exhibit examples where DAC is more efficient than its standard counterpart as a function approximator, and we prove a universal representation theorem.

Major rewriting; supersedes v1.
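The abstract's description of DAC, where each input is filtered through the nonlinearity with its own bias before the linear combination, suggests a dense-layer variant computing output_j = Σ_i w_ij · ReLU(x_i + b_ij), with one bias per connection instead of one per unit. The sketch below illustrates this reading in Keras; it is an assumption inferred from the abstract, not the authors' released implementation, and the layer name `DACDense` is hypothetical.

```python
import tensorflow as tf
from tensorflow import keras


class DACDense(keras.layers.Layer):
    """Minimal sketch of a dense unit with per-connection biases:
    each output unit j filters each input i with its own bias b_ij
    through a ReLU *before* the linear combination (preactivation)."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        n_in = int(input_shape[-1])
        # One weight and one bias per (input, output) connection.
        self.w = self.add_weight(shape=(n_in, self.units),
                                 initializer="glorot_uniform", name="w")
        self.b = self.add_weight(shape=(n_in, self.units),
                                 initializer="zeros", name="b")

    def call(self, x):
        # (batch, n_in, 1) + (n_in, units) -> (batch, n_in, units):
        # each connection gets its own independent preactivation.
        pre = tf.expand_dims(x, -1) + self.b
        act = tf.nn.relu(pre)                       # per-connection nonlinear filter
        return tf.reduce_sum(act * self.w, axis=1)  # linear combination afterwards

# Usage: y = DACDense(64)(tf.random.normal((8, 32)))  -> shape (8, 64)
```

Note that per-connection biases roughly double the parameter count relative to a standard dense layer (one bias per weight instead of one per output unit), which is presumably why the abstract reports the implementation's computational efficiency alongside accuracy.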

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....78544b8643e4149f70cbdc50e59c9222