
Scale-covariant and scale-invariant Gaussian derivative networks

Publication Year :
2021

Abstract

This paper presents a hybrid approach between scale-space theory and deep learning, in which a deep learning architecture is constructed by coupling parameterized scale-space operations in cascade. By sharing the learnt parameters between multiple scale channels, and by using the transformation properties of the scale-space primitives under scaling transformations, the resulting network becomes provably scale covariant. By additionally performing max pooling over the multiple scale channels, the resulting image classification architecture also becomes provably scale invariant. We investigate the performance of such networks on the MNIST Large Scale dataset, which contains images rescaled from the original MNIST over a factor of 4 for the training data and over a factor of 16 for the test data. It is demonstrated that the resulting approach allows for scale generalization, enabling good performance when classifying patterns at scales not spanned by the training data.

Part of proceedings: ISBN 978-3-030-75548-5. Not duplicate with DiVA 1505585. QC 20210317.
Scale-space theory for covariant and invariant visual perception
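The mechanism described in the abstract can be illustrated with a minimal sketch: apply scale-normalized Gaussian derivative filters at several scales (the scale channels), combine them with weights that are shared across all channels, and take the maximum over channels. All function names and the particular derivative basis below are illustrative assumptions, not the paper's actual implementation, which cascades several such layers.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_derivative_layer(image, sigma, weights):
    # Scale-normalized first- and second-order Gaussian derivatives
    # (multiplying by sigma^k makes responses comparable across scales).
    # The weights are SHARED across all scale channels.
    Lx = sigma * gaussian_filter(image, sigma, order=(0, 1))
    Ly = sigma * gaussian_filter(image, sigma, order=(1, 0))
    Lxx = sigma ** 2 * gaussian_filter(image, sigma, order=(0, 2))
    Lyy = sigma ** 2 * gaussian_filter(image, sigma, order=(2, 0))
    basis = np.stack([Lx, Ly, Lxx, Lyy])
    return np.tensordot(weights, basis, axes=1)

def scale_invariant_response(image, sigmas, weights):
    # One scale channel per sigma; max pooling over the channels makes
    # the response approximately invariant to rescalings of the input,
    # provided the sigma range is wide and dense enough.
    channels = [gaussian_derivative_layer(image, s, weights) for s in sigmas]
    return np.max(np.stack(channels), axis=0)

# Hypothetical usage: a square pattern probed at three scales.
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
w = np.array([0.5, 0.5, 0.25, 0.25])
out = scale_invariant_response(img, sigmas=[1.0, 2.0, 4.0], weights=w)
```

Because a rescaled input mainly shifts which scale channel responds most strongly, the max over channels changes little, which is the intuition behind the provable invariance claimed in the abstract.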

Details

Database :
OAIster
Author :
Lindeberg, Tony
Publication Type :
Electronic Resource
Accession number :
edsoai.on1248708550
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1007/978-3-030-75549-2_1