Exact one-sided confidence limits for Cohen's kappa as a measurement of agreement.

Authors :
Shan, Guogen
Wang, Weizhen
Source :
Statistical Methods in Medical Research, Apr 2017, Vol. 26, Issue 2, p615-632. 18p.
Publication Year :
2017

Abstract

Cohen's kappa coefficient, κ, is a statistical measure of inter-rater (or inter-annotator) agreement for qualitative items. In this paper, we focus on interval estimation of κ in the case of two raters and binary items. So far, only asymptotic and bootstrap intervals are available for κ due to its complexity. However, there is no guarantee that such intervals capture κ with the desired nominal level 1 − α; in other words, statistical inferences based on these intervals are not reliable. We apply the Buehler method to obtain exact confidence intervals based on four widely used asymptotic intervals: three Wald-type confidence intervals and one interval constructed from a profile variance. These exact intervals are compared with regard to coverage probability and length for small to medium sample sizes. The exact intervals based on the Garner interval and the Lee and Tu interval are generally recommended for use in practice due to their good performance in both coverage probability and length. [ABSTRACT FROM AUTHOR]
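For orientation only, the sketch below (not taken from the paper) shows how the quantities in the abstract arise for two raters and binary items: the kappa point estimate from a 2x2 agreement table and one simple two-sided Wald-type interval. The function name, the example counts, and the simplified standard-error formula sqrt(p_o(1 - p_o)) / ((1 - p_e) sqrt(n)) are illustrative assumptions; the authors' exact Buehler-adjusted limits and the specific asymptotic intervals they compare (Garner; Lee and Tu; profile variance) are not reproduced here.

    # Minimal, hypothetical sketch: Cohen's kappa and a Wald-type interval
    # for two raters and binary items. Not the authors' exact method.
    import math
    from statistics import NormalDist

    def cohen_kappa_wald_ci(a, b, c, d, alpha=0.05):
        """2x2 agreement table of counts:
               rater 2: yes   rater 2: no
        rater 1: yes     a            b
        rater 1: no      c            d
        Returns (kappa, lower, upper) for a two-sided (1 - alpha) Wald interval."""
        n = a + b + c + d
        p_o = (a + d) / n                                       # observed agreement
        p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
        kappa = (p_o - p_e) / (1 - p_e)
        # Simplified large-sample standard error (illustrative assumption).
        se = math.sqrt(p_o * (1 - p_o) / n) / (1 - p_e)
        z = NormalDist().inv_cdf(1 - alpha / 2)
        return kappa, kappa - z * se, kappa + z * se

    if __name__ == "__main__":
        # Hypothetical counts: 40 yes/yes, 5 yes/no, 8 no/yes, 47 no/no.
        k, lo, hi = cohen_kappa_wald_ci(40, 5, 8, 47)
        print(f"kappa = {k:.3f}, 95% Wald CI = ({lo:.3f}, {hi:.3f})")

Such Wald-type intervals are the asymptotic starting points the paper refers to; the Buehler method then adjusts them so that the resulting one-sided limits attain the nominal level exactly.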

Details

Language :
English
ISSN :
0962-2802
Volume :
26
Issue :
2
Database :
Academic Search Index
Journal :
Statistical Methods in Medical Research
Publication Type :
Academic Journal
Accession number :
122467134
Full Text :
https://doi.org/10.1177/0962280214552881