
Synaptic Stripping: How Pruning Can Bring Dead Neurons Back To Life

Authors:
Whitaker, Tim
Whitley, Darrell
Publication Year:
2023

Abstract

Rectified Linear Units (ReLU) are the default choice for activation functions in deep neural networks. While they demonstrate excellent empirical performance, ReLU activations can fall victim to the dead neuron problem. In these cases, the weights feeding into a neuron are pushed into a state where the neuron outputs zero for all inputs. Consequently, the gradient is also zero for all inputs, so the weights feeding into the neuron cannot update. The neuron is unable to recover through backpropagation alone, and model capacity is reduced because those parameters can no longer be optimized. Inspired by a neurological process of the same name, we introduce Synaptic Stripping as a means to combat this dead neuron problem. By automatically removing problematic connections during training, we can regenerate dead neurons and significantly improve model capacity and parametric utilization. Synaptic Stripping is easy to implement and results in sparse networks that are more efficient than the dense networks they are derived from. We conduct several ablation studies to investigate these dynamics as a function of network width and depth, and we explore Synaptic Stripping with Vision Transformers on a variety of benchmark datasets.

Comment: 8 pages, 5 figures, Submitted to IJCNN-23
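The abstract describes detecting neurons that output zero for all inputs and removing problematic incoming connections so the unit can fire again. The sketch below is an illustrative PyTorch reading of that idea, not the authors' released implementation: it assumes dead units are identified by all-zero post-ReLU activations over a probe batch and that the "problematic" connections to strip are the negative incoming weights; the function name, layer sizes, and pruning criterion are assumptions for demonstration only.

```python
# Minimal sketch of the dead-neuron detection / connection-stripping idea.
# Assumptions (not from the paper's code): dead = never fires on a probe batch,
# and the connections stripped are the negative incoming weights of dead units.
import torch
import torch.nn as nn


@torch.no_grad()
def strip_dead_neurons(layer: nn.Linear, activations: torch.Tensor) -> int:
    """Zero out negative incoming weights of neurons that never activate.

    activations: post-ReLU outputs of `layer` on a probe batch,
                 shape (batch, out_features).
    Returns the number of dead neurons found.
    """
    dead = (activations == 0).all(dim=0)                # unit never fired on the batch
    if dead.any():
        # weight has shape (out_features, in_features); select negative synapses
        # feeding into dead units and remove ("strip") them.
        mask = dead.unsqueeze(1) & (layer.weight < 0)
        layer.weight[mask] = 0.0
    return int(dead.sum())


# Example usage: probe a hidden layer with a batch, then strip dead units.
layer = nn.Linear(64, 128)
x = torch.randn(256, 64)
acts = torch.relu(layer(x))
n_dead = strip_dead_neurons(layer, acts)
print(f"stripped connections into {n_dead} dead neurons")
```

In practice a check like this would run periodically during training (e.g. once per epoch), producing progressively sparser layers while giving previously dead units a chance to receive non-zero gradients again.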

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2302.05818
Document Type:
Working Paper