
Dynamic embeddings for efficient parameter learning of Bayesian network with multiple latent variables.

Authors :
Qi, Zhiwei
Yue, Kun
Duan, Liang
Hu, Kuang
Liang, Zhihong
Source :
Information Sciences. Apr 2022, Vol. 590, p. 198-216. 19 p.
Publication Year :
2022

Abstract

• Give a parameter learning method for BN with latent variables based on dynamic embeddings.
• Propose an incremental SVD method to generate dynamic embeddings.
• Conduct experiments showing the efficiency and effectiveness of our methods.

Latent variables (LVs), which represent unobservable abstract concepts such as patient disease and customer credit, play an important role in simplifying the network structure and improving the interpretability of a Bayesian network (BN). However, incorporating LVs into a BN leads to missing probability parameters, since the LVs themselves are never observed. The expectation-maximization (EM) algorithm, the classic method for parameter estimation in the presence of LVs, suffers from high complexity and slow convergence. To this end, we propose dynamic embeddings for parameter learning of BNs with LVs. First, we reconstruct the E-step of EM and use dynamic embeddings to calculate the weights of fractional samples, which reduces the computational complexity of parameter learning. Second, we construct a pointwise mutual information (PMI) matrix to represent the directed weighted graphs (DWGs) transformed from the updated parameters. Third, we adopt incremental singular value decomposition (SVD) to generate dynamic embeddings that capture the updated parameters while preserving the BN's graphical structure. Experimental results show that our proposed methods are efficient and effective: on real-world BNs, our method outperforms state-of-the-art methods for parameter learning of BNs with multiple LVs in efficiency, convergence and accuracy.
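The abstract only names the building blocks (a PMI matrix built from a directed weighted graph, followed by an SVD to obtain node embeddings); the paper's exact constructions, including the incremental SVD update, are not given here. The sketch below is an illustrative assumption using NumPy: it builds a generic positive-PMI matrix from a graph's weight matrix and takes a batch truncated SVD for embeddings, and should not be read as the authors' algorithm.

```python
# Hedged sketch: generic PMI-from-weights + batch SVD embeddings.
# The paper's actual PMI construction and incremental SVD differ; all
# formulas and parameter names here are assumptions for illustration.
import numpy as np

def pmi_matrix(W, eps=1e-12):
    """Positive PMI matrix from the weight matrix W of a directed
    weighted graph, treating normalized weights as joint probabilities."""
    total = W.sum() + eps
    p_ij = W / total                        # joint probability estimate p(i, j)
    p_i = p_ij.sum(axis=1, keepdims=True)   # row marginals p(i)
    p_j = p_ij.sum(axis=0, keepdims=True)   # column marginals p(j)
    pmi = np.log((p_ij + eps) / (p_i @ p_j + eps))
    return np.maximum(pmi, 0.0)             # clip negatives (positive PMI)

def svd_embeddings(M, dim=16):
    """Rank-`dim` node embeddings from a truncated SVD of M."""
    U, S, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :dim] * np.sqrt(S[:dim])

# Usage on a small random weighted graph standing in for a DWG
# obtained from updated BN parameters.
rng = np.random.default_rng(0)
W = rng.random((20, 20))
emb = svd_embeddings(pmi_matrix(W), dim=8)
print(emb.shape)  # (20, 8)
```

In the paper's setting the embeddings would be refreshed incrementally as the parameters (and hence the DWG weights) change across EM iterations, rather than recomputed from scratch as in this batch sketch.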

Details

Language :
English
ISSN :
0020-0255
Volume :
590
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
155121838
Full Text :
https://doi.org/10.1016/j.ins.2022.01.020