Efficient-PrototypicalNet with self knowledge distillation for few-shot learning
- Author
- Jit Yan Lim, Chin Poo Lee, Shih Yin Ooi, and Kian Ming Lim
- Subjects
- Contextual image classification, Computer science, Cognitive Neuroscience, Machine learning, Computer Science Applications, Task (computing), Artificial intelligence, Metric (mathematics), Benchmark (computing), Feature (machine learning), Generalizability theory, Performance improvement, Transfer of learning
- Abstract
Recent few-shot learning research has focused on developing methods that can quickly adapt to unseen tasks with small amounts of data and low computational cost. To achieve higher performance on few-shot tasks, a method must generalize well from seen tasks to unseen tasks given only a limited number of samples. In this work, we investigate a new metric-based few-shot learning framework that transfers knowledge from another effective classification model to produce well-generalized embeddings and improve performance on unseen tasks. The proposed Efficient-PrototypicalNet combines transfer learning, knowledge distillation, and few-shot learning. We employ a pre-trained model as a feature extractor to obtain useful features from each task and reduce task complexity; these features ease training in the few-shot setting and increase accuracy. In addition, we apply knowledge distillation within the framework to obtain a further performance gain. The proposed Efficient-PrototypicalNet was evaluated on five benchmark datasets, i.e., Omniglot, miniImageNet, tieredImageNet, CIFAR-FS, and FC100, and achieved state-of-the-art performance on most of them in the 5-way K-shot image classification task, especially on miniImageNet.
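The abstract names three ingredients: a pre-trained feature extractor, a metric-based (prototypical-network style) classifier over episode embeddings, and a knowledge-distillation loss. The sketch below shows how these pieces typically fit together in PyTorch; it is a minimal illustration under assumed names (`student_enc`, `teacher_enc`) and hyperparameters (`temperature`, `kd_weight`), not the authors' actual implementation.

```python
# Minimal sketch of a prototypical-network episode with a knowledge-distillation
# term (assumed setup, not the paper's exact configuration).
import torch
import torch.nn.functional as F


def prototypical_logits(support, support_labels, query, n_way):
    """Query logits as negative squared Euclidean distances to class prototypes."""
    # support: (n_way * k_shot, d), query: (n_query, d)
    prototypes = torch.stack(
        [support[support_labels == c].mean(dim=0) for c in range(n_way)]
    )  # (n_way, d)
    distances = torch.cdist(query, prototypes) ** 2  # (n_query, n_way)
    return -distances


def episode_loss(student_enc, teacher_enc, support_x, support_y, query_x, query_y,
                 n_way, temperature=4.0, kd_weight=0.5):
    """Cross-entropy on query predictions plus a KL distillation term from a teacher."""
    s_logits = prototypical_logits(student_enc(support_x), support_y,
                                   student_enc(query_x), n_way)
    ce = F.cross_entropy(s_logits, query_y)

    with torch.no_grad():  # the teacher only supplies soft targets
        t_logits = prototypical_logits(teacher_enc(support_x), support_y,
                                       teacher_enc(query_x), n_way)
    kd = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                  F.softmax(t_logits / temperature, dim=1),
                  reduction="batchmean") * temperature ** 2
    return ce + kd_weight * kd
```

Using the negative squared Euclidean distance to class prototypes as logits follows the standard Prototypical Networks formulation; the temperature-scaled KL term softens the student's episode predictions toward the teacher's, which is the usual way a distillation loss is attached to an episodic objective.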
- Published
- 2021