
Rethinking Feature Distribution for Loss Functions in Image Classification

Authors:
Wan, Weitao
Zhong, Yuanyi
Li, Tianpeng
Chen, Jiansheng
Publication Year:
2018

Abstract

We propose a large-margin Gaussian Mixture (L-GM) loss for deep neural networks in classification tasks. Unlike the softmax cross-entropy loss, our proposal rests on the assumption that the deep features of the training set follow a Gaussian Mixture distribution. By incorporating a classification margin and a likelihood regularization term, the L-GM loss achieves both high classification performance and accurate modeling of the training feature distribution. As such, the L-GM loss is superior to the softmax loss and its major variants in that, beyond classification, it can readily be used to detect abnormal inputs, such as adversarial examples, based on the likelihood of their features under the training feature distribution. Extensive experiments on various recognition benchmarks, including MNIST, CIFAR, ImageNet and LFW, as well as on adversarial examples, demonstrate the effectiveness of our proposal.

Comment: Accepted to CVPR 2018 as spotlight
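The abstract describes the loss only at a high level. As a rough illustration, the following is a minimal PyTorch sketch of a large-margin Gaussian Mixture loss under simplifying assumptions: identity covariance for every class and uniform class priors (the paper itself allows learnable covariances). The class name LGMLoss, the hyperparameters alpha and lambda_, and the mean initialization are our own choices for illustration, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LGMLoss(nn.Module):
    """Minimal sketch of a large-margin Gaussian Mixture (L-GM) loss.

    Simplifying assumptions (not the paper's full formulation):
    identity covariance per class and uniform class priors, so the
    per-class log-likelihood reduces to a squared Euclidean distance.
    `alpha` is the classification margin; `lambda_` weights the
    likelihood regularization term.
    """
    def __init__(self, feat_dim, num_classes, alpha=0.1, lambda_=0.01):
        super().__init__()
        # Learnable per-class Gaussian means (illustrative initialization).
        self.means = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.1)
        self.alpha = alpha
        self.lambda_ = lambda_

    def forward(self, feats, labels):
        # Squared Mahalanobis distance to each class mean; under the
        # identity-covariance assumption this is squared Euclidean distance.
        diff = feats.unsqueeze(1) - self.means.unsqueeze(0)   # (B, K, D)
        dist = 0.5 * diff.pow(2).sum(dim=2)                   # (B, K)

        # Margin: enlarge the ground-truth class distance by (1 + alpha),
        # which forces features closer to their own class mean.
        one_hot = F.one_hot(labels, self.means.size(0)).float()
        logits = -dist * (1.0 + self.alpha * one_hot)

        # Classification term: cross-entropy over the margin logits.
        cls_loss = F.cross_entropy(logits, labels)

        # Likelihood regularization: negative log-likelihood of each
        # feature under its own class Gaussian (up to constants).
        lkd_loss = (dist * one_hot).sum(dim=1).mean()
        return cls_loss + self.lambda_ * lkd_loss
```

In use, a criterion such as LGMLoss(feat_dim=64, num_classes=10) would be applied to the network's penultimate-layer features, so the per-class means are learned jointly with the feature extractor; the same per-class distance can then serve as a likelihood-based anomaly score at test time, as the abstract suggests for detecting adversarial examples.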

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1803.02988
Document Type:
Working Paper