
Mitigating Uncertainty in Document Classification

Authors :
Zhang, Xuchao
Chen, Fanglan
Lu, Chang-Tien
Ramakrishnan, Naren
Publication Year :
2019

Abstract

The uncertainty measurement of classifiers' predictions is especially important in applications such as medical diagnoses that need to ensure limited human resources can focus on the most uncertain predictions returned by machine learning models. However, few existing uncertainty models attempt to improve overall prediction accuracy where human resources are involved in the text classification task. In this paper, we propose a novel neural-network-based model that applies a new dropout-entropy method for uncertainty measurement. We also design a metric learning method on feature representations, which can boost the performance of dropout-based uncertainty methods with smaller prediction variance in accurate prediction trials. Extensive experiments on real-world data sets demonstrate that our method can achieve a considerable improvement in overall prediction accuracy compared to existing approaches. In particular, our model improved the accuracy from 0.78 to 0.92 when 30% of the most uncertain predictions were handed over to human experts on the "20NewsGroup" data.

Comment: Accepted by NAACL19
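The abstract describes a dropout-based uncertainty measure whose scores decide which predictions to defer to human experts. A minimal sketch of that general idea follows, assuming a PyTorch classifier with dropout layers; the function names (`dropout_entropy`, `split_by_uncertainty`) and the use of Monte Carlo dropout with predictive entropy are illustrative assumptions, not the authors' exact formulation or metric learning component.

```python
# Hypothetical sketch: Monte Carlo dropout + predictive entropy as an
# uncertainty score, then deferring the most uncertain items to humans.
# This illustrates the general dropout-entropy idea, not the paper's method.
import torch
import torch.nn.functional as F

def dropout_entropy(model, x, n_samples=20):
    """Score uncertainty as the entropy of the predictive distribution
    averaged over stochastic forward passes with dropout kept active."""
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x), dim=-1) for _ in range(n_samples)]
        ).mean(dim=0)                              # (batch, num_classes)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    return probs.argmax(dim=-1), entropy

def split_by_uncertainty(entropy, defer_fraction=0.3):
    """Return indices of the most uncertain predictions to hand to experts,
    mirroring the 30% deferral setting mentioned in the abstract."""
    k = int(defer_fraction * entropy.numel())
    return entropy.topk(k).indices
```

In this setup the remaining 70% of examples keep the model's predictions, so overall accuracy combines machine predictions on confident items with expert labels on deferred ones.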

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1907.07590
Document Type :
Working Paper