
Class-specific discriminative metric learning for scene recognition.

Authors :
Wang, Chen
Peng, Guohua
De Baets, Bernard
Source :
Pattern Recognition, Jun 2022, Vol. 126.
Publication Year :
2022

Abstract

• Learn class-specific discriminative distance metrics for different classes.
• Incorporate least squares regression to relax the class-specific metric learning.
• Extensive experiments demonstrate the effectiveness of our method.

Metric learning aims to learn an appropriate distance metric for a given machine learning task. Despite its impressive performance in the field of image recognition, it may still not be discriminative enough for scene recognition because of the high within-class diversity and high between-class similarity of scene images. In this paper, we propose a novel class-specific discriminative metric learning method (CSDML) to alleviate these problems. More specifically, we learn a distinctive linear transformation for each class (or, equivalently, a Mahalanobis distance metric for each class), which projects the samples of that class into a corresponding low-dimensional discriminative space. The overall aim is to simultaneously minimize the Euclidean distances between the projections of samples of the same class (or, equivalently, the Mahalanobis distances between these samples) and maximize the Euclidean distances between the projections of samples of different classes. Additionally, we incorporate least squares regression into the optimization problem, rendering class-specific metric learning more flexible and better suited to scene recognition. Experimental results on four benchmark scene datasets demonstrate that the proposed method outperforms most state-of-the-art approaches. [ABSTRACT FROM AUTHOR]
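To make the core idea concrete, the following is a minimal sketch of *class-specific* metric learning, not the paper's actual CSDML objective (the least squares regression relaxation and the exact optimization are not reproduced here). For each class we learn one linear map `L_c` (equivalently a Mahalanobis metric `L_c.T @ L_c`) that keeps that class's samples close to their mean while pushing other classes' samples away; here this is approximated by taking the top eigenvectors of a between-class minus within-class scatter difference. All function names and the `lam` trade-off parameter are illustrative assumptions.

```python
import numpy as np

def learn_class_metrics(X, y, dim=2, lam=1.0):
    """Sketch: one linear projection per class.

    For class c, S_w is the scatter of class-c samples around their mean
    and S_b the scatter of all other samples around that same mean.
    The top eigenvectors of S_b - lam * S_w span directions that
    separate class c from the rest while keeping it compact.
    (Illustrative surrogate, not the paper's CSDML optimization.)
    """
    metrics = {}
    for c in np.unique(y):
        Xc, Xo = X[y == c], X[y != c]
        mu = Xc.mean(axis=0)
        Sw = (Xc - mu).T @ (Xc - mu) / len(Xc)
        Sb = (Xo - mu).T @ (Xo - mu) / len(Xo)
        # eigh returns eigenvalues in ascending order; reverse for the top ones
        _, V = np.linalg.eigh(Sb - lam * Sw)
        L = V[:, ::-1][:, :dim].T          # dim x D class-specific projection
        metrics[c] = (L, L @ mu)           # store projection and projected mean
    return metrics

def predict(x, metrics):
    # assign x to the class whose projected mean is nearest
    # under that class's own metric
    dists = {c: np.linalg.norm(L @ x - m) for c, (L, m) in metrics.items()}
    return min(dists, key=dists.get)
```

Because each `L_c` has orthonormal rows, distances computed under different class metrics are on comparable scales, which is what makes the nearest-projected-mean rule in `predict` meaningful across classes.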

Details

Language :
English
ISSN :
0031-3203
Volume :
126
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
155815082
Full Text :
https://doi.org/10.1016/j.patcog.2022.108589