Forest Learning From Data and its Universal Coding.
- Source: IEEE Transactions on Information Theory; Nov 2018, Vol. 64, Issue 11, p7349-7358, 10p
- Publication Year: 2018
Abstract
- This paper considers structure learning from data with $n$ samples of $p$ variables, assuming that the structure is a forest, using the Chow–Liu algorithm. Specifically, for incomplete data, we construct two model selection algorithms that complete in $O(p^{2})$ steps: one obtains a forest with the maximum posterior probability given the data, and the other obtains a forest that converges to the true one as $n$ increases. We show that the two forests are generally different when some values are missing. In addition, we present estimates on benchmark data sets to demonstrate that both algorithms work in realistic situations. Moreover, we derive the conditional entropy provided that no value is missing, and we evaluate the per-sample expected redundancy for the universal coding of incomplete data in terms of the number of non-missing samples.
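
For context, the Chow–Liu step referenced in the abstract scores all $O(p^{2})$ variable pairs by estimated mutual information and greedily keeps the highest-scoring edges that do not create a cycle. The sketch below is a minimal illustration for discrete, complete data, using a plug-in mutual-information estimate and a simple threshold to prune weak edges into a forest; the function names and the threshold rule are illustrative assumptions, not the paper's posterior-based or consistency-based selection criteria for incomplete data.

```python
# Minimal Chow-Liu forest sketch (assumption: discrete, complete data and a
# plug-in mutual-information estimate; the paper's actual criteria for
# incomplete data are more refined than this threshold rule).
import numpy as np
from itertools import combinations

def empirical_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in nats from two discrete sample vectors."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            p_xy = np.mean((x == a) & (y == b))
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (np.mean(x == a) * np.mean(y == b)))
    return mi

def chow_liu_forest(data, threshold=0.0):
    """Return forest edges (i, j) over the columns of `data` (n samples x p variables)."""
    n, p = data.shape
    # Score all O(p^2) variable pairs by empirical mutual information.
    scored = [(empirical_mutual_information(data[:, i], data[:, j]), i, j)
              for i, j in combinations(range(p), 2)]
    scored.sort(reverse=True)
    # Kruskal: add the highest-scoring edges that do not create a cycle.
    parent = list(range(p))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    forest = []
    for mi, i, j in scored:
        if mi <= threshold:
            break  # dropping weak edges yields a forest rather than a spanning tree
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            forest.append((i, j))
    return forest

# Example: three binary variables where X2 copies X0 and X1 is independent noise;
# the recovered forest should contain only the edge (0, 2).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 3))
X[:, 2] = X[:, 0]
print(chow_liu_forest(X, threshold=0.05))
```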
Details
- Language: English
- ISSN: 0018-9448
- Volume: 64
- Issue: 11
- Database: Complementary Index
- Journal: IEEE Transactions on Information Theory
- Publication Type: Academic Journal
- Accession Number: 132604458
- Full Text: https://doi.org/10.1109/TIT.2018.2869215