
An investigation on mutual information for the linear predictive system and the extrapolation of speech signals

Publication Year :
2020

Abstract

Mutual information (MI) is an important information-theoretic concept with many applications in telecommunications, blind source separation, and machine learning. More recently, it has also been employed for the instrumental assessment of speech intelligibility, where traditionally correlation-based measures are used. In this paper, we address the difference between MI and correlation from the viewpoint of discovering dependencies between variables in the context of speech signals. We perform our investigation using the linear predictive approximation and the extrapolation of speech signals as examples. We compare a parametric MI estimation approach based on a Gaussian mixture model (GMM) with the k-nearest neighbor (KNN) approach, a well-known non-parametric MI estimator. We show that the GMM-based MI estimator leads to more consistent results. © 2020 Sprachkommunikation - 10. ITG-Fachtagung. All rights reserved.
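
The following is a minimal illustrative sketch, not the authors' implementation: it estimates the MI between a speech-like AR(1) signal and its one-step linear prediction, once with scikit-learn's KNN-based estimator and once with a simple GMM-based parametric estimate. The toy signal, model orders, number of neighbors, and number of mixture components are all assumptions made for illustration.

```python
# Sketch: KNN-based vs. GMM-based MI estimation for a linear predictive setting.
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy "speech" signal: first-order autoregressive process x[n] = a*x[n-1] + e[n].
a, n = 0.9, 5000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + e[t]

past, present = x[:-1].reshape(-1, 1), x[1:]

# 1) Non-parametric KNN-based MI estimate (as implemented in scikit-learn).
mi_knn = mutual_info_regression(past, present, n_neighbors=3, random_state=0)[0]

# 2) Parametric GMM-based MI estimate: fit Gaussian mixtures to the joint and
#    marginal densities, then approximate MI = E[log p(x,y) - log p(x) - log p(y)]
#    by a sample average over the data.
joint = np.column_stack([past.ravel(), present])
gmm_joint = GaussianMixture(n_components=4, random_state=0).fit(joint)
gmm_x = GaussianMixture(n_components=4, random_state=0).fit(past)
gmm_y = GaussianMixture(n_components=4, random_state=0).fit(present.reshape(-1, 1))
mi_gmm = np.mean(
    gmm_joint.score_samples(joint)
    - gmm_x.score_samples(past)
    - gmm_y.score_samples(present.reshape(-1, 1))
)

print(f"KNN MI estimate: {mi_knn:.3f} nats")
print(f"GMM MI estimate: {mi_gmm:.3f} nats")
```

For this Gaussian AR(1) example both estimates should lie near the closed-form value -0.5*log(1 - a^2) ≈ 0.83 nats; the point of the comparison is only to show the two estimation routes, not to reproduce the paper's results.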

Details

Database :
OAIster
Authors :
Taghia, Jalil; Martin, Rainer; Leijon, Arne
Publication Type :
Electronic Resource
Accession number :
edsoai.on1235094336
Document Type :
Electronic Resource