
Self-Augmentation: Generalizing Deep Networks to Unseen Classes for Few-Shot Learning

Authors: Seo, Jin-Woo; Jung, Hong-Gyu; Lee, Seong-Whan
Publication Year: 2020

Abstract

Few-shot learning aims to classify unseen classes with a few training examples. While recent works have shown that standard mini-batch training with a carefully designed training strategy can improve generalization to unseen classes, well-known problems in deep networks such as memorizing training statistics have been less explored in few-shot learning. To tackle this issue, we propose self-augmentation, which consolidates self-mix and self-distillation. Specifically, we exploit a regional dropout technique called self-mix, in which a patch of an image is replaced with values taken from elsewhere in the same image. We then employ a backbone network with auxiliary branches, each having its own classifier, to enforce knowledge sharing. Lastly, we present a local representation learner to further exploit the few training examples available for unseen classes. Experimental results show that the proposed method outperforms state-of-the-art methods on prevalent few-shot benchmarks and improves generalization ability.

Comment: The first two authors contributed equally to this work.
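The abstract's description of self-mix is concrete enough to sketch. Below is a minimal illustration in PyTorch, assuming a single (C, H, W) image tensor; the function name self_mix, the patch-fraction parameter, and uniform patch sampling are hypothetical choices for illustration, not the paper's exact procedure.

import torch

def self_mix(img: torch.Tensor, patch_frac: float = 0.25) -> torch.Tensor:
    # Regional dropout sketch: overwrite a random destination patch
    # with a patch cut from another location in the same image.
    _, h, w = img.shape  # expects (C, H, W)
    ph, pw = int(h * patch_frac), int(w * patch_frac)
    # destination patch: the region to be "dropped"
    y1 = torch.randint(0, h - ph + 1, (1,)).item()
    x1 = torch.randint(0, w - pw + 1, (1,)).item()
    # source patch: sampled elsewhere in the same image
    y2 = torch.randint(0, h - ph + 1, (1,)).item()
    x2 = torch.randint(0, w - pw + 1, (1,)).item()
    out = img.clone()
    out[:, y1:y1 + ph, x1:x1 + pw] = img[:, y2:y2 + ph, x2:x2 + pw]
    return out

# usage (84x84 is a common few-shot input size, chosen here for illustration):
# augmented = self_mix(torch.rand(3, 84, 84))

The knowledge-sharing objective among the auxiliary classifiers is not specified in the abstract. A common self-distillation form combines per-branch cross-entropy with a KL term that pulls each auxiliary branch toward the main classifier's softened predictions; the following sketch assumes that form, with temperature T and weight alpha as hypothetical hyperparameters.

import torch
import torch.nn.functional as F

def knowledge_sharing_loss(main_logits, aux_logits_list, labels,
                           alpha: float = 0.5, T: float = 4.0):
    # Cross-entropy on the main classifier.
    loss = F.cross_entropy(main_logits, labels)
    # Softened target from the main classifier (gradient stopped).
    soft_target = F.softmax(main_logits.detach() / T, dim=1)
    for aux in aux_logits_list:
        # Each auxiliary branch is supervised by the labels...
        loss = loss + F.cross_entropy(aux, labels)
        # ...and distilled toward the main classifier's distribution.
        loss = loss + alpha * T * T * F.kl_div(
            F.log_softmax(aux / T, dim=1), soft_target,
            reduction="batchmean")
    return loss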

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2004.00251
Document Type: Working Paper
Full Text: https://doi.org/10.1016/j.neunet.2021.02.007