
Making up the shortages of the Bayes classifier by the maximum mutual information classifier

Authors :
Xiaohui Zou
Chenguang Lu
Wenfeng Wang
Xiaofeng Chen
Source :
The Journal of Engineering. 2020:659-663
Publication Year :
2020
Publisher :
Institution of Engineering and Technology (IET), 2020.

Abstract

The Bayes classifier is widely used because it is simple and because the maximum posterior probability (MPP) criterion it uses is equivalent to the minimum error rate criterion. However, it has issues in the following circumstances: (i) when information, rather than correctness, is more important, we should use the maximum likelihood criterion or the maximum information criterion, which can reduce the rate of failing to report small-probability events; (ii) for unseen instance classifications, a previously optimised classifier cannot be used properly once the probability distribution of the true classes changes; (iii) when the classes' feature distributions, rather than the transition probability functions (TPFs), are stable, it is improper to train a parametric TPF, such as the logistic function; (iv) for multi-label classifications, it is difficult to optimise the group of parametric TPFs that the Bayes classifier needs. This study addresses these issues by comparing the MPP criterion with the maximum likelihood criterion and the maximum mutual information (MMI) criterion. It suggests using the MMI criterion for most unseen instance classifications, and it presents a new iterative algorithm, the channel matching (CM) algorithm, for MMI classification. Two examples show that the CM algorithm is fast and reliable.
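To illustrate point (i), the sketch below contrasts the MPP decision rule with a decision rule chosen to maximise the mutual information I(true class; label) on a simple 1-D, two-class problem in which one class is a small-probability event. This is only an illustrative toy example under assumed Gaussian class-conditional densities and priors; it is not the authors' channel matching (CM) algorithm, and all names and values in it are hypothetical.

    # Hypothetical sketch: MPP rule vs. a threshold chosen to maximise
    # I(true class; predicted label) for a 1-D two-class problem with a
    # rare class. Illustration only; not the paper's CM algorithm.
    import numpy as np
    from scipy.stats import norm

    # Class priors: class 1 is a small-probability event (assumed values).
    priors = np.array([0.95, 0.05])
    # Assumed Gaussian class-conditional feature densities.
    means, stds = np.array([0.0, 2.0]), np.array([1.0, 1.0])

    x = np.linspace(-4.0, 6.0, 2001)

    def mutual_information(threshold):
        """I(C; Y) in bits for the rule y = 1 if x > threshold, else y = 0."""
        # P(y=1 | c) is the Gaussian tail mass beyond the threshold.
        p_y1_given_c = norm.sf(threshold, loc=means, scale=stds)
        # Joint distribution P(y, c), shape (y, c).
        p_joint = np.stack([priors * (1 - p_y1_given_c), priors * p_y1_given_c])
        p_y = p_joint.sum(axis=1, keepdims=True)
        p_c = priors[None, :]
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = p_joint * np.log2(p_joint / (p_y * p_c))
        return np.nansum(terms)

    # MPP threshold: the point where the posterior of class 1 first exceeds
    # the posterior of class 0.
    posteriors = priors[:, None] * norm.pdf(x, means[:, None], stds[:, None])
    mpp_threshold = x[np.argmax(posteriors[1] > posteriors[0])]

    # MMI threshold: grid search for the cut that maximises I(C; Y).
    mi = np.array([mutual_information(t) for t in x])
    mmi_threshold = x[np.argmax(mi)]

    print(f"MPP threshold: {mpp_threshold:.2f}  (rarely reports the rare class)")
    print(f"MMI threshold: {mmi_threshold:.2f}  (I = {mi.max():.3f} bits)")

With these assumed priors, the MPP cut sits well to the right of the MMI cut, so the MPP rule almost never reports the rare class; the MMI cut trades some overall accuracy for a lower rate of failing to report the small-probability event, which is the behaviour the abstract attributes to the information-based criteria.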

Details

ISSN :
2051-3305
Volume :
2020
Database :
OpenAIRE
Journal :
The Journal of Engineering
Accession number :
edsair.doi...........c9ef295c9a991dab78c47e5bff5aa5f3