
Learning Kullback-Leibler Divergence-Based Gaussian Model for Multivariate Time Series Classification

Authors:
Gongqing Wu
Huicheng Zhang
Ying He
Xianyu Bao
Lei Li
Xuegang Hu
Source:
IEEE Access, Vol 7, Pp 139580-139591 (2019)
Publication Year:
2019
Publisher:
IEEE, 2019.

Abstract

Multivariate time series (MTS) classification is an important classification problem in which the data has a temporal attribute. Because the relationships among the many variables of an MTS are complex and time-varying, existing methods do not perform well on MTS classification with many attribute variables. In this paper, we therefore propose a novel model-based classification method, called Kullback-Leibler Divergence-based Gaussian Model Classification (KLD-GMC), which converts the original MTS data into the two key parameters of a multivariate Gaussian model: the mean vector and the inverse covariance matrix. The inverse covariance is the most important parameter because it captures the dependencies between variables; the more variables there are, the more such information the inverse covariance can encode, so KLD-GMC handles the relationships between variables in the MTS well. The sparse inverse covariance of each subsequence is then estimated with the Graphical Lasso. Furthermore, the Kullback-Leibler divergence is used as the similarity measure to classify unlabeled subsequences, because it effectively measures the similarity between different distributions. Experimental results on classical MTS datasets demonstrate that our method improves the performance of multivariate time series classification and outperforms state-of-the-art methods.
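The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `GraphicalLasso` for the sparse inverse covariance step, uses the closed-form KL divergence between two multivariate Gaussians, and assigns each test subsequence the label of the nearest training model under KL divergence; the regularization weight `alpha` and the nearest-model decision rule are assumptions, and the paper's actual training and classification details may differ.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def fit_gaussian(X, alpha=0.1):
    """Fit the two model parameters of a subsequence X (n_samples x n_vars):
    the mean vector and the sparse inverse covariance (precision) matrix,
    the latter estimated via the Graphical Lasso."""
    gl = GraphicalLasso(alpha=alpha).fit(X)
    return gl.location_, gl.precision_

def kl_gaussian(mu_p, prec_p, mu_q, prec_q):
    """Closed-form KL(p || q) for multivariate Gaussians given mean and precision:
    0.5 * [tr(Sq^-1 Sp) + (mq-mp)^T Sq^-1 (mq-mp) - d + ln(det Sq / det Sp)]."""
    d = mu_p.shape[0]
    cov_p = np.linalg.inv(prec_p)          # Sigma_p from its precision
    diff = mu_q - mu_p
    term_trace = np.trace(prec_q @ cov_p)  # tr(Sigma_q^-1 Sigma_p)
    term_mahal = diff @ prec_q @ diff      # Mahalanobis term under q
    _, logdet_prec_p = np.linalg.slogdet(prec_p)
    _, logdet_prec_q = np.linalg.slogdet(prec_q)
    term_logdet = logdet_prec_p - logdet_prec_q  # ln det Sigma_q - ln det Sigma_p
    return 0.5 * (term_trace + term_mahal - d + term_logdet)

def classify(test_X, train_models, alpha=0.1):
    """Label a test subsequence by the training Gaussian model with the
    smallest KL divergence from the test model (nearest-model rule, assumed)."""
    mu_t, prec_t = fit_gaussian(test_X, alpha)
    best_label, best_kl = None, np.inf
    for label, (mu, prec) in train_models:
        kl = kl_gaussian(mu_t, prec_t, mu, prec)
        if kl < best_kl:
            best_label, best_kl = label, kl
    return best_label
```

Note that KL divergence is asymmetric; the sketch above always measures KL from the test model to each training model, which is one of several plausible conventions.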

Details

Language:
English
ISSN:
21693536
Volume:
7
Database:
Directory of Open Access Journals
Journal:
IEEE Access
Publication Type:
Academic Journal
Accession number:
edsdoj.bb734b1f8cab4c419c972c255ec26bdc
Document Type:
article
Full Text:
https://doi.org/10.1109/ACCESS.2019.2943474