1. Kullback–Leibler Divergence Metric Learning
- Authors
- Zizhao Zhang, Shihui Ying, Xibin Zhao, Shuyi Ji, Yue Gao, and Liejun Wang
- Subjects
Kullback–Leibler divergence, Similarity (geometry), Computer science, Matrix (mathematics), Electrical and Electronic Engineering, Divergence (statistics), Information Retrieval, Document classification, Software engineering, Manifold, Computer Science Applications, Human-Computer Interaction, Linear map, Research Design, Control and Systems Engineering, Metric (mathematics), Method of steepest descent, Algorithm, Software, Distribution (differential geometry), Information Systems
- Abstract
The Kullback–Leibler divergence (KLD), which is widely used to measure the similarity between two distributions, plays an important role in many applications. In this article, we address the KLD metric-learning task, which aims to learn the best KLD-type metric from the distributions of datasets. Concretely, we first extend the conventional KLD by introducing a linear mapping, and obtain the best KLD for expressing the similarity of data distributions by optimizing this mapping. This improves the discriminability of the metric over the data distributions: distributions from the same class are drawn close together, while those from different classes are pushed apart. We then model KLD metric learning as a minimization problem on the manifold of positive-definite matrices. To solve this optimization task, we develop an intrinsic steepest descent method whose iterations preserve the manifold structure of the metric. Finally, we apply the proposed method, alongside ten popular metric-learning approaches, to 3-D object classification and document classification. The experimental results show that our proposed method outperforms all the compared methods.
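The two ingredients the abstract describes can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes the distributions are Gaussians (for which the KLD has a closed form), that the linear mapping `A` acts on the data as `x -> A x` (so `N(mu, S)` becomes `N(A mu, A S A^T)`), and that the manifold update is the standard exponential-map step on the space of symmetric positive-definite (SPD) matrices; the function names `kl_gaussian`, `mapped_kl`, and `spd_step` are hypothetical.

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) between two Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff
                  - d + logdet1 - logdet0)

def mapped_kl(mu0, S0, mu1, S1, A):
    """KLD after the linear map x -> A x, which sends N(mu, S) to N(A mu, A S A^T).
    Optimizing over A is the extension of the KLD described in the abstract."""
    return kl_gaussian(A @ mu0, A @ S0 @ A.T, A @ mu1, A @ S1 @ A.T)

def spd_step(M, grad, eta):
    """One intrinsic steepest-descent step on the SPD manifold:
    M_next = M^{1/2} expm(-eta * M^{-1/2} grad M^{-1/2}) M^{1/2}.
    Unlike a plain Euclidean step M - eta*grad, this stays SPD for any eta."""
    w, V = np.linalg.eigh(M)
    M_half = V @ np.diag(np.sqrt(w)) @ V.T
    M_ihalf = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    S = M_ihalf @ grad @ M_ihalf
    S = 0.5 * (S + S.T)           # symmetrize against round-off
    ws, Vs = np.linalg.eigh(-eta * S)
    expS = Vs @ np.diag(np.exp(ws)) @ Vs.T   # matrix exponential via eigh
    return M_half @ expS @ M_half
```

With `A` set to the identity, `mapped_kl` reduces to the plain Gaussian KLD; a metric-learning loop would repeatedly evaluate a loss built from `mapped_kl` over same-class and different-class distribution pairs and update the metric matrix with `spd_step`, which is what keeps the iterates on the SPD manifold.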
- Published
- 2022