Flexible model weighting for one-dependence estimators based on point-wise independence analysis.
- Source :
- Pattern Recognition. Jul 2023, Vol. 139.
- Publication Year :
- 2023
Abstract
- • We prove theoretically, and exemplify, that information-theoretic metrics such as MI and CMI are the mathematical expectations of corresponding probability-theoretic (point-wise) metrics, and that the non-negativity of the information-theoretic metrics does not hold when dealing with specific instances. The probability-theoretic metrics are introduced to verify the independence assumptions and are then applied to reduce both structural complexity and computational complexity.
- • To alleviate the i.i.d. assumption and approach a global optimum, the point-wise log likelihood function PLL(B | t) is introduced to measure the number of bits encoded in the network topology of B learned from a specific instance t; it is then assigned to the SPODE members of AODE as a self-adaptive weighting metric. This learning approach helps approximate the true joint probability and improves generalization performance.
- • We compare the performance of our classifier with other state-of-the-art classifiers on 36 datasets, ranging in size from 57 to 1,025,010 instances and from 3 to 64 attributes, and show that our algorithm achieves remarkable classification performance in terms of zero-one loss, RMSE, bias-variance decomposition, and the conditional log likelihood (CLL) function.

Recent studies have shown that Bayesian network classifiers (BNCs) are powerful tools for knowledge representation and classification. Averaged one-dependence estimators (AODE) is one of the most popular and effective BNCs, since its independence assumptions and ensemble learning strategy allow it to trade off bias against variance. However, unverified independence assumptions may result in biased estimates of the probability distribution and, in turn, degraded classification performance. In this paper, we prove theoretically the uncertainty of probability-theoretic independence and propose to measure the independence between the attribute values implicated in a specific instance.
The estimates of conditional probability can then be finely tuned based on this point-wise independence analysis. The point-wise log likelihood function is applied as a weighting metric for the committee members of AODE to improve the estimate of the joint probability. Extensive experiments on 36 benchmark datasets show that, compared to other state-of-the-art classifiers, weighted one-dependence estimators using point-wise independence analysis achieve competitive classification performance in terms of zero-one loss, RMSE, bias-variance decomposition, and conditional log likelihood.
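The first highlight's claim — that mutual information is the expectation of a point-wise quantity which can itself be negative for specific instances — can be illustrated with a toy joint distribution. This is a minimal sketch with illustrative probabilities (not taken from the paper):

```python
import math

# Toy joint distribution over two correlated binary variables X and Y
# (values are illustrative, chosen so X and Y are dependent but not perfectly)
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# Point-wise mutual information for each instance (x, y):
#   pmi(x, y) = log2( p(x, y) / (p(x) * p(y)) )
pmi = {(x, y): math.log2(p_xy[(x, y)] / (p_x[x] * p_y[y]))
       for (x, y) in p_xy}

# MI is the expectation of pmi under the joint distribution,
# and is guaranteed non-negative even when individual pmi terms are not
mi = sum(p_xy[xy] * pmi[xy] for xy in p_xy)

print(pmi)  # pmi(0,1) and pmi(1,0) are negative for these probabilities
print(mi)   # the expectation remains non-negative
```

Here the instances (0, 1) and (1, 0) occur less often than independence would predict, so their point-wise terms are negative, yet the expectation (MI) stays positive — which is why point-wise analysis can distinguish instances that an aggregate metric like MI averages away.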
Details
- Language :
- English
- ISSN :
- 0031-3203
- Volume :
- 139
- Database :
- Academic Search Index
- Journal :
- Pattern Recognition
- Publication Type :
- Academic Journal
- Accession number :
- 162848510
- Full Text :
- https://doi.org/10.1016/j.patcog.2023.109473