
Distance Measure Machines

Authors:
Rakotomamonjy, Alain
Traoré, Abraham
Berar, Maxime
Flamary, Rémi
Courty, Nicolas
Publication Year:
2018

Abstract

This paper presents a distance-based discriminative framework for learning with probability distributions. Instead of using kernel mean embeddings or generalized radial basis kernels, we introduce embeddings based on the dissimilarity of distributions to a set of reference distributions, denoted as templates. Our framework extends the similarity theory of Balcan et al. (2008) to the population distribution case, and we show that, for some learning problems, certain dissimilarities between distributions yield low-error linear decision functions with high probability. Our key result is to prove that the theory also holds for empirical distributions. Algorithmically, the proposed approach consists of computing a mapping based on pairwise dissimilarities, on top of which a linear decision function can be learned. Our experimental results show that the Wasserstein distance embedding performs better than kernel mean embeddings, and that computing Wasserstein distances is far more tractable than estimating pairwise Kullback-Leibler divergences between empirical distributions.
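As a rough illustration of the embedding idea described in the abstract, the sketch below represents each example as an empirical distribution, embeds it as a vector of Wasserstein distances to a few template distributions, and trains a linear classifier in that dissimilarity space. This is not the authors' implementation: the toy 1-D Gaussian data, the choice of SciPy's `wasserstein_distance` and scikit-learn's `LinearSVC`, and the way templates are drawn are all assumptions made for the sake of a minimal, runnable example.

```python
# Minimal sketch of a distance-measure embedding (hypothetical toy setup).
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Toy dataset: each example is a bag of 1-D samples drawn from a Gaussian
# whose mean depends on the class label (assumption, not the paper's data).
def make_bag(label, n=100):
    return rng.normal(loc=2.0 * label, scale=1.0, size=n)

labels = rng.integers(0, 2, size=200)
bags = [make_bag(y) for y in labels]

# Templates: a small set of reference empirical distributions.
templates = [make_bag(rng.integers(0, 2)) for _ in range(10)]

# Embedding: vector of Wasserstein distances to each template.
def embed(bag):
    return np.array([wasserstein_distance(bag, t) for t in templates])

X = np.stack([embed(b) for b in bags])

# Linear decision function learned in the dissimilarity space.
clf = LinearSVC().fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

In this sketch the templates are just random training-like bags; in practice they could be chosen from the training distributions themselves, and the 1-D Wasserstein distance would be replaced by a general optimal-transport distance for multivariate data.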

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1803.00250
Document Type:
Working Paper