
Measurement of interrater agreement with adjustment for covariates.

Authors: Barlow W
Source: Biometrics, 1996 Jun; Vol. 52(2), pp. 695-702
Publication Year: 1996

Abstract

The kappa coefficient measures chance-corrected agreement between two observers in the dichotomous classification of subjects. However, the marginal probability of classification by each rater may depend on one or more confounding variables, and failure to account for these confounders may inflate estimates of agreement. A multinomial model is used that assumes both raters share the same marginal probability of classification, with that probability allowed to depend on one or more covariates. The model may be fit using standard software for conditional logistic regression, and likelihood-based confidence intervals for the parameter representing agreement may be computed. A simple example illustrates model fitting and application of the technique.
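The abstract states only that the model can be fit with conditional-logistic-regression software; the sketch below illustrates one standard way to do so, not necessarily the paper's exact parameterization. Each subject is expanded into its four possible rating pairs (1,1), (1,0), (0,1), (0,0); the observed pair is marked as the single "case" in its stratum, and a 1:3 matched conditional logit is fit. The y1*y2 term stands in for an agreement (log-odds-ratio-type) parameter, and a symmetric main-effect term enforces a common margin for both raters. All simulated data, variable names, and the choice of statsmodels' ConditionalLogit are illustrative assumptions.

    # Minimal sketch: covariate-adjusted agreement via conditional logit.
    import numpy as np
    from statsmodels.discrete.conditional_models import ConditionalLogit

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)  # one covariate per subject (hypothetical)

    # Simulate two correlated dichotomous ratings sharing a common,
    # covariate-dependent marginal probability of classification.
    p = 1.0 / (1.0 + np.exp(-(0.3 + 0.8 * x)))
    shared = rng.binomial(1, p)
    r1 = np.where(rng.random(n) < 0.7, shared, rng.binomial(1, p))
    r2 = np.where(rng.random(n) < 0.7, shared, rng.binomial(1, p))

    # Unadjusted kappa for reference: (p_obs - p_exp) / (1 - p_exp),
    # using the common margin m for the chance-agreement term.
    p_obs = np.mean(r1 == r2)
    m = (np.mean(r1) + np.mean(r2)) / 2.0
    p_exp = m**2 + (1 - m) ** 2
    print("unadjusted kappa:", (p_obs - p_exp) / (1 - p_exp))

    # Expand each subject into the four candidate rating pairs and mark
    # the observed pair as the case within its stratum.
    cells = np.array([(1, 1), (1, 0), (0, 1), (0, 0)])
    y1c = np.tile(cells[:, 0], n)
    y2c = np.tile(cells[:, 1], n)
    subj = np.repeat(np.arange(n), 4)
    xs = np.repeat(x, 4)
    case = ((y1c == np.repeat(r1, 4)) & (y2c == np.repeat(r2, 4))).astype(int)

    # Design: symmetric marginal term s = y1 + y2 (same margin for both
    # raters), its interaction with the covariate, and the association
    # term y1 * y2 standing in for the agreement parameter.
    s = y1c + y2c
    X = np.column_stack([s, s * xs, y1c * y2c])
    fit = ConditionalLogit(case, X, groups=subj).fit()
    print(fit.summary())  # last coefficient: association/agreement term

The same expanded-data layout works with any conditional-logistic routine (for example R's survival::clogit), and a likelihood-based confidence interval for the association coefficient can be obtained by profiling the conditional log-likelihood, in the spirit of the intervals described in the abstract.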

Details

Language: English
ISSN: 0006-341X
Volume: 52
Issue: 2
Database: MEDLINE
Journal: Biometrics
Publication Type: Academic Journal
Accession Number: 10766505