
Towards Efficient Capsule Networks

Authors :
Renzulli, Riccardo
Grangetto, Marco
Publication Year :
2022

Abstract

Since neural networks came to dominate image processing, the computational complexity required to solve the targeted tasks has skyrocketed. To counter this unsustainable trend, many strategies have been developed that ambitiously aim to preserve performance. Promoting sparse topologies, for example, allows deep neural network models to be deployed on embedded, resource-constrained devices. Recently, Capsule Networks were introduced to enhance the explainability of a model, with each capsule being an explicit representation of an object or of its parts. These models show promising results on toy datasets, but their low scalability prevents deployment on more complex tasks. In this work, we explore sparsity, in addition to capsule representations, to improve computational efficiency by reducing the number of capsules. We show that pruning Capsule Networks achieves high generalization with lower memory requirements, less computational effort, and shorter inference and training times.

Comment: Accepted at ICIP 2022 Special Session SCENA: Simplification, Compression and Efficiency with Neural networks and Artificial intelligence
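The sparsity the abstract alludes to is often induced by magnitude-based pruning. The sketch below illustrates that general technique only; it is not the paper's specific method, and the function name, weight values, and 50% sparsity ratio are illustrative assumptions.

```python
# Minimal sketch of magnitude-based pruning (illustrative, not the
# paper's exact algorithm): zero out the smallest-magnitude fraction
# of the weights to obtain a sparse model.

def prune_by_magnitude(weights, sparsity):
    """Return a copy of `weights` with the smallest-magnitude
    fraction `sparsity` of entries set to zero."""
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)  # number of weights to drop
    threshold = flat[k - 1] if k > 0 else float("-inf")
    return [0.0 if abs(w) <= threshold else w for w in weights]

# Hypothetical weight vector; prune half of it away.
weights = [0.9, -0.05, 0.4, -0.7, 0.01, 0.3]
pruned = prune_by_magnitude(weights, 0.5)
# The three smallest-magnitude entries (-0.05, 0.01, 0.3) become zero.
```

In a real network the same thresholding is applied per layer (or globally) to the weight tensors, after which the model is typically fine-tuned to recover accuracy.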

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2208.09203
Document Type :
Working Paper