1. CNNs Avoid the Curse of Dimensionality by Learning on Patches
- Author
Vamshi C. Madala, Shivkumar Chandrasekaran, and Jason Bunk
- Subjects
A priori analysis, convolutional neural networks, curse of dimensionality, generalization error, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Despite the success of convolutional neural networks (CNNs) in numerous computer vision tasks and their extraordinary generalization performance, attempts to predict the generalization errors of CNNs have so far been limited to a posteriori analyses. A priori theories explaining the generalization performance of deep neural networks have mostly ignored convolutionality and do not specify why CNNs seemingly overcome the curse of dimensionality on computer vision tasks such as image classification, where the image dimensions are in the thousands. Our work attempts to explain the generalization performance of CNNs on image classification under the hypothesis that CNNs operate on the domain of image patches. Ours is the first work we are aware of to derive an a priori bound on the generalization error of CNNs, and we present both quantitative and qualitative evidence in support of our theory. Our patch-based theory also offers an explanation for why data augmentation techniques such as Cutout, CutMix, and random cropping are effective in improving the generalization error of CNNs.
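To make the patch hypothesis concrete, the following is a minimal sketch (not taken from the paper) of decomposing an image into overlapping patches; the function name, patch size k, and stride are illustrative assumptions, but the dimension comparison in the comments conveys why learning on patches can sidestep the curse of dimensionality.

```python
import numpy as np

def extract_patches(image: np.ndarray, k: int = 8, stride: int = 4) -> np.ndarray:
    """Return all k x k patches of an (H, W, C) image taken with the given stride."""
    h, w, c = image.shape
    patches = [
        image[i:i + k, j:j + k]
        for i in range(0, h - k + 1, stride)
        for j in range(0, w - k + 1, stride)
    ]
    return np.stack(patches)  # shape: (num_patches, k, k, c)

# Each patch lives in a k*k*c-dimensional space (e.g. 8*8*3 = 192 dimensions),
# far smaller than the full image space (e.g. 224*224*3 = 150528 dimensions).
# Viewing a CNN as a function of such low-dimensional patches, rather than of the
# whole image, is the intuition behind the patch-based explanation of its
# generalization behaviour, and it also suggests why patch-level augmentations
# (Cutout, CutMix, random cropping) help.
```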
- Published
2023