Normalized Maximal Margin Loss for Open-Set Image Classification
- Author
- Huazhi Sun, Bojue Wang, Jingwei Sun, Long Zhang, Donghao Wu, Chunmei Ma, and Jinqi Zhu
- Subjects
General Computer Science, Computer Science, General Engineering, General Materials Science, convolutional neural network, discriminative model, margin (machine learning), inter-class distance, open-set image classification, image classification, pattern recognition, deep metric learning, feature (computer vision), test set, softmax function, metric (mathematics), artificial intelligence
- Abstract
This work addresses image classification under the open-set protocol, in which classes in the test set do not appear in the training set. Intuitively, a convolutional neural network (CNN) trained with the softmax loss is a straightforward solution. However, unknown classes (those not predefined in the training set) blur the intra-class and inter-class boundaries, making image classification more challenging. Although softmax variants such as the center loss and the CosFace loss focus on learning discriminative features by minimizing intra-class distance, they do not explicitly maximize inter-class distance, which our experiments show is more important for the open-set problem. Moreover, although deep metric learning with the contrastive loss or the triplet loss can learn features that are discriminative both within and between classes, it requires a time-consuming image sampling process during training. In this paper, we propose a novel normalized maximal margin (NMM) loss for open-set image classification, which not only explicitly minimizes intra-class distance and maximizes inter-class distance, but also defines margins for both. Specifically, after analyzing through a geometric interpretation the advantage of the angular space obtained when the softmax loss is normalized over both features and weights, we make NMM operate in angular space. The validity of NMM for learning discriminative features is then demonstrated from the same geometric viewpoint. After that, we derive an upper bound on the inter-class margin through theoretical analysis. Finally, extensive experiments are conducted on popular datasets: CIFAR-100 (object recognition), ImageNet (image classification), LFW (face recognition), and MSMT17 (person re-identification) to verify the effectiveness of NMM. The experimental results show that NMM achieves very competitive performance.
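For readers unfamiliar with normalized angular-margin losses, the sketch below illustrates the general family the abstract refers to: features and class weights are L2-normalized so that each logit becomes the cosine of the angle between a feature and a class weight, and a margin is imposed on the target-class cosine (here in the CosFace style mentioned above). This is only a minimal illustrative sketch; the exact NMM formulation, including its explicit inter-class margin and the derived upper bound, is defined in the paper itself, and the scale and margin values below are assumed for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedAngularMarginLoss(nn.Module):
    """Illustrative CosFace-style normalized margin loss (not the paper's NMM).

    Features and class weights are L2-normalized so logits equal cos(theta);
    a margin is subtracted from the target-class cosine before cross-entropy.
    """
    def __init__(self, feat_dim, num_classes, scale=30.0, margin=0.35):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale      # assumed scale factor for the normalized logits
        self.margin = margin    # assumed margin on the target-class cosine

    def forward(self, features, labels):
        # Normalize features and class weights so their inner product is cos(theta).
        w = F.normalize(self.weight, dim=1)
        x = F.normalize(features, dim=1)
        cos_theta = x @ w.t()                                  # (batch, num_classes)
        # Subtract the margin from the target-class cosine only.
        one_hot = F.one_hot(labels, cos_theta.size(1)).float()
        logits = self.scale * (cos_theta - self.margin * one_hot)
        return F.cross_entropy(logits, labels)

# Example usage with illustrative shapes:
loss_fn = NormalizedAngularMarginLoss(feat_dim=512, num_classes=100)
feats = torch.randn(8, 512)              # embeddings from a CNN backbone
labels = torch.randint(0, 100, (8,))
loss = loss_fn(feats, labels)
```

Working in this angular space is what allows intra-class and inter-class separation to be expressed directly as angular margins, which is the setting in which the paper defines and bounds the NMM margins.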
- Published
- 2021