
Scale-covariant and scale-invariant Gaussian derivative networks

Publication Year :
2022

Abstract

This paper presents a hybrid approach between scale-space theory and deep learning, where a deep learning architecture is constructed by coupling parameterized scale-space operations in cascade. By sharing the learnt parameters between multiple scale channels, and by using the transformation properties of the scale-space primitives under scaling transformations, the resulting network becomes provably scale covariant. By additionally performing max pooling over the multiple scale channels, or some other permutation-invariant pooling over scales, the resulting network architecture for image classification also becomes provably scale invariant. We investigate the performance of such networks on the MNIST Large Scale dataset, which contains images rescaled from the original MNIST dataset over a factor of 4 concerning training data and over a factor of 16 concerning testing data. It is demonstrated that the resulting approach allows for scale generalization, enabling good performance for classifying patterns at scales not spanned by the training data.
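The construction described in the abstract can be sketched numerically: scale-normalized Gaussian derivative responses are computed in parallel scale channels, a learned weight vector is shared across all channels, and max pooling over the channels yields an approximately scale-invariant output. The following is a minimal illustrative sketch, not the paper's actual architecture; the function names, the choice of derivatives up to order two, and the Laplacian-style weight vector in the usage example are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_norm_derivatives(image, sigma):
    # Scale-normalized Gaussian derivatives up to order 2 (gamma = 1):
    # a derivative of order n is multiplied by sigma**n, which is what
    # makes responses comparable across scale channels.
    L   = gaussian_filter(image, sigma)
    Lx  = sigma    * gaussian_filter(image, sigma, order=(0, 1))
    Ly  = sigma    * gaussian_filter(image, sigma, order=(1, 0))
    Lxx = sigma**2 * gaussian_filter(image, sigma, order=(0, 2))
    Lyy = sigma**2 * gaussian_filter(image, sigma, order=(2, 0))
    Lxy = sigma**2 * gaussian_filter(image, sigma, order=(1, 1))
    return np.stack([L, Lx, Ly, Lxx, Lyy, Lxy])

def gaussian_derivative_channel(image, sigma, weights):
    # One scale channel: a linear combination of the derivative
    # responses. The same weight vector is shared by every channel,
    # which is what gives scale covariance.
    feats = scale_norm_derivatives(image, sigma)
    return np.tensordot(weights, feats, axes=1)

def scale_invariant_score(image, sigmas, weights):
    # Max pooling over the scale channels: a permutation-invariant
    # pooling over scales, yielding an (approximately) scale-invariant
    # scalar output.
    return max(gaussian_derivative_channel(image, s, weights).max()
               for s in sigmas)

# Usage: a Gaussian blob of size t0 gives nearly the same score
# regardless of t0, because rescaling the input only shifts which
# scale channel attains the maximum.
y, x = np.mgrid[0:65, 0:65]
blob = lambda t0: np.exp(-((x - 32.0)**2 + (y - 32.0)**2) / (2.0 * t0**2))
w = np.array([0.0, 0.0, 0.0, -1.0, -1.0, 0.0])   # minus the Laplacian
sigmas = np.geomspace(1.0, 8.0, 13)
s_small = scale_invariant_score(blob(2.0), sigmas, w)
s_large = scale_invariant_score(blob(4.0), sigmas, w)
```

Here `s_small` and `s_large` come out close to each other (and close to the analytical value 0.5 for a unit-amplitude blob), illustrating the scale-generalization property the abstract describes.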

Details

Database :
OAIster
Authors :
Lindeberg, Tony
Publication Type :
Electronic Resource
Accession number :
edsoai.on1312721668
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1007/s10851-021-01057-9