Probabilistic Distance for Mixtures of Independent Component Analyzers
- Author
- Safont, Gonzalo; Salazar, Addisson; Vergara, Luis; Gomez, Enriqueta; Villanueva, Vicente
- Subjects
- INDEPENDENT component analysis; ARTIFICIAL neural networks; ELECTROENCEPHALOGRAPHY; NEUROPSYCHOLOGICAL tests; PROBABILITY density function
- Abstract
- Independent component analysis (ICA) is a blind source separation technique in which data are modeled as linear combinations of several independent non-Gaussian sources. The independence and linearity restrictions are relaxed using several ICA mixture models (ICAMMs), obtaining a two-layer artificial neural network structure. This allows for dependence between sources of different classes, and thus a myriad of multidimensional probability density functions can be accurately modeled. This paper proposes a new probabilistic distance (PDI) between the parameters learned for two ICAMMs. The PDI is computed explicitly, unlike the popular Kullback-Leibler divergence (KLD) and other similar metrics, removing the need for numerical integration. Furthermore, the PDI is symmetric and bounded between 0 and 1, which enables its use as a posterior probability in fusion approaches. In this paper, the PDI is employed for change detection by measuring the distance between two ICAMMs learned in consecutive time windows. Such changes may correspond to relevant states of the process under analysis, which are explicitly reflected in the learned ICAMM parameters. The proposed distance was tested in two challenging applications using simulated and real data: 1) detecting flaws in materials using ultrasounds and 2) detecting changes in electroencephalography signals from humans performing neuropsychological tests. The results demonstrate that the PDI outperforms the KLD in change-detection capabilities.
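The record does not give the PDI's closed form, so the following is only a minimal Python sketch of the change-detection pipeline the abstract describes: learn a model in each of two consecutive time windows and flag a change when the distance between the learned parameters exceeds a threshold. The `model_distance` here is a hypothetical stand-in (the squared Hellinger distance between two Gaussian fits), chosen solely because, like the PDI, it has an explicit closed form, is symmetric, and is bounded between 0 and 1; the Gaussian fit likewise stands in for learning ICAMM parameters and is not the paper's method.

```python
import numpy as np


def fit_gaussian(x):
    """Fit a 1-D Gaussian (a stand-in for learning ICAMM parameters)."""
    return x.mean(), x.std(ddof=1)


def model_distance(p, q):
    """Squared Hellinger distance between two 1-D Gaussians.

    Explicit closed form, symmetric, bounded between 0 and 1 -- the same
    properties the abstract claims for the PDI, though the PDI itself is
    a different distance defined on ICAMM parameters.
    """
    (m1, s1), (m2, s2) = p, q
    bc = np.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * np.exp(
        -((m1 - m2) ** 2) / (4 * (s1**2 + s2**2))
    )
    return 1.0 - bc


def detect_changes(signal, win=200, threshold=0.2):
    """Compare models learned in consecutive windows of length `win`;
    flag the window boundary when the model distance exceeds `threshold`."""
    changes = []
    for start in range(0, len(signal) - 2 * win, win):
        left = fit_gaussian(signal[start:start + win])
        right = fit_gaussian(signal[start + win:start + 2 * win])
        if model_distance(left, right) > threshold:
            changes.append(start + win)  # boundary between the two windows
    return changes


# Toy usage: a mean shift halfway through the record should be flagged.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 1000), rng.normal(3, 1, 1000)])
print(detect_changes(x))  # expected: a change point near sample 1000
```

Because the distance is bounded between 0 and 1, the threshold has a fixed, interpretable scale across windows, which is also what makes such a score usable as a posterior probability in fusion approaches, as the abstract notes for the PDI.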
- Published
- 2018