
The Kappa Paradox Explained.

Authors: Derksen, Bastiaan M.; Bruinsma, Wendy; Goslings, Johan Carel; Schep, Niels W.L.
Source: Journal of Hand Surgery (03635023); May 2024, Vol. 49 Issue 5, p482-485, 4p
Publication Year: 2024

Abstract

Observer reliability studies for fracture classification systems evaluate agreement using Cohen's κ and absolute agreement as outcome measures. Cohen's κ is a chance-corrected measure of agreement: it equals 1 for perfect agreement, 0 when agreement is no better than expected by chance, and can fall below 0 when agreement is worse than chance. Absolute agreement is the percentage of cases on which observers agree on the item they rate. Some studies report high absolute agreement but a relatively low κ value, which is counterintuitive. This phenomenon is referred to as the Kappa Paradox. The objective of this article was to explain the statistical phenomenon of the Kappa Paradox and to help readers and researchers recognize and prevent this phenomenon.
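The paradox arises because Cohen's κ corrects observed agreement for the agreement expected by chance, and that expected agreement is computed from the observers' marginal frequencies: when one category dominates the sample (for example, most fractures are of a single type), chance agreement is already high, so even strong absolute agreement leaves little room above chance and κ stays low. The Python sketch below is not taken from the article, and the two 2x2 rating tables are hypothetical; it shows two tables with identical absolute agreement but very different marginal prevalence producing very different κ values.

def cohens_kappa(table):
    """Cohen's kappa for a square contingency table.

    table[i][j] = number of cases observer A rated category i
    and observer B rated category j.
    """
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal.
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # Expected chance agreement, computed from the marginal totals.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    p_e = sum(r * c for r, c in zip(row_totals, col_totals)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Both hypothetical tables have 87% absolute agreement (87 of 100 cases
# on the diagonal), but the first has a heavily skewed prevalence.
skewed   = [[85, 7], [6, 2]]    # 92 vs. 8 marginal split  -> kappa ~ 0.16
balanced = [[45, 7], [6, 42]]   # ~50/50 marginal split    -> kappa ~ 0.74

for name, table in [("skewed", skewed), ("balanced", balanced)]:
    n = sum(sum(row) for row in table)
    agreement = sum(table[i][i] for i in range(2)) / n
    print(f"{name}: absolute agreement = {agreement:.0%}, "
          f"kappa = {cohens_kappa(table):.2f}")

With identical absolute agreement of 87%, the skewed table yields κ ≈ 0.16 while the balanced table yields κ ≈ 0.74: the counterintuitive gap the article refers to as the Kappa Paradox.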

Details

Language: English
ISSN: 03635023
Volume: 49
Issue: 5
Database: Supplemental Index
Journal: Journal of Hand Surgery (03635023)
Publication Type: Academic Journal
Accession Number: 176783733
Full Text: https://doi.org/10.1016/j.jhsa.2024.01.006