
Fast and Memory-Efficient Import Vector Domain Description.

Authors:
Decherchi, Sergio
Cavalli, Andrea
Source:
Neural Processing Letters; Aug 2020, Vol. 52, Issue 1, p511-524, 14p
Publication Year:
2020

Abstract

One-class learning is a classical and hard computational intelligence task. The literature offers some effective and powerful solutions to the problem. In the kernel machines realm, examples include Support Vector Domain Description and the recently proposed Import Vector Domain Description (IVDD), which directly delivers the probability of a sample belonging to the class. Here, we propose and discuss two optimization techniques for IVDD that significantly improve its memory footprint and consequently let it scale to larger datasets than the original formulation allows. First, we propose approximating the Gaussian kernel with random features, together with a primal optimization algorithm. Second, we propose a Nyström-like approximation of the functional, together with a fast-converging and accurate self-consistent algorithm. In particular, we replace the a posteriori sparsity of the original IVDD optimization method with an a priori random selection of landmark samples from the dataset. We find this second approximation to be superior: compared to the original IVDD with the RBF kernel, it achieves high accuracy, is much faster, and grants huge memory savings. [ABSTRACT FROM AUTHOR]
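The two approximation strategies named in the abstract (random Fourier features for the Gaussian kernel, and a Nyström-style feature map built from randomly chosen landmarks) are standard kernel-approximation tools. Below is a minimal NumPy sketch of both, purely illustrative: the function names, the bandwidth gamma, and the feature/landmark counts are assumptions, not the authors' implementation. Either map turns the n x n kernel matrix into an n x m feature matrix, which is where the memory savings come from.

```python
import numpy as np

def random_fourier_features(X, n_features=300, gamma=1.0, seed=None):
    """Random Fourier features (Rahimi & Recht) approximating the
    Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2), so that
    k(x, y) ~= z(x) @ z(y). Illustrative sketch, not the paper's code."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def nystrom_features(X, landmarks, gamma=1.0):
    """Nystrom-style feature map from a priori chosen landmark samples:
    k(x, y) ~= phi(x) @ phi(y) with phi(x) = k_L(x) @ K_LL^{-1/2}."""
    def rbf(A, B):
        # Pairwise squared distances, then Gaussian kernel block.
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    K_LL = rbf(landmarks, landmarks)          # landmark-landmark block
    K_xL = rbf(X, landmarks)                  # data-landmark block
    # Symmetric inverse square root of K_LL via eigendecomposition.
    vals, vecs = np.linalg.eigh(K_LL)
    vals = np.clip(vals, 1e-10, None)         # guard against tiny eigenvalues
    return K_xL @ (vecs @ np.diag(vals ** -0.5) @ vecs.T)

# Usage: a one-class model can then be trained in the primal on either
# feature matrix, storing O(n * m) numbers instead of an O(n^2) kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
Z_rff = random_fourier_features(X, n_features=300, gamma=0.5, seed=1)
landmarks = X[rng.choice(len(X), size=50, replace=False)]
Z_nys = nystrom_features(X, landmarks, gamma=0.5)
print(Z_rff.shape, Z_nys.shape)  # (1000, 300) (1000, 50)
```

The landmark route mirrors the abstract's point that a priori random landmark selection replaces the a posteriori sparsity of the original IVDD solver; the number of landmarks directly bounds the model's memory footprint.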

Details

Language:
English
ISSN:
1370-4621
Volume:
52
Issue:
1
Database:
Complementary Index
Journal:
Neural Processing Letters
Publication Type:
Academic Journal
Accession Number:
145079297
Full Text:
https://doi.org/10.1007/s11063-020-10243-6