
Reliability of trachoma clinical grading--assessing grading of marginal cases.

Authors :
Salman A Rahman
Sun N Yu
Abdou Amza
Sintayehu Gebreselassie
Boubacar Kadri
Nassirou Baido
Nicole E Stoller
Joseph P Sheehan
Travis C Porco
Bruce D Gaynor
Jeremy D Keenan
Thomas M Lietman
Source :
PLoS Neglected Tropical Diseases, Vol 8, Iss 5, p e2840 (2014)
Publication Year :
2014
Publisher :
Public Library of Science (PLoS), 2014.

Abstract

Clinical examination of trachoma is used to justify intervention in trachoma-endemic regions. Currently, field graders are certified by determining their concordance with experienced graders using the kappa statistic. Unfortunately, trachoma grading can be highly variable and there are cases where even expert graders disagree (borderline/marginal cases). Prior work has shown that inclusion of borderline cases tends to reduce apparent agreement, as measured by kappa. Here, we confirm those results and assess performance of trainees on these borderline cases by calculating their reliability error, a measure derived from the decomposition of the Brier score.

We trained 18 field graders using 200 conjunctival photographs from a community-randomized trial in Niger and assessed inter-grader agreement using kappa as well as reliability error. Three experienced graders scored each case for the presence or absence of trachomatous inflammation-follicular (TF) and trachomatous inflammation-intense (TI). A consensus grade for each case was defined as the one given by a majority of experienced graders. We classified cases into a unanimous subset if all 3 experienced graders gave the same grade. For both TF and TI grades, the mean kappa for trainees was higher on the unanimous subset; inclusion of borderline cases reduced apparent agreement by 15.7% for TF and 12.4% for TI. When we assessed the breakdown of the reliability error, we found that our trainees tended to over-call TF grades and under-call TI grades, especially in borderline cases.

The kappa statistic is widely used for certifying trachoma field graders. Exclusion of borderline cases, which even experienced graders disagree on, increases apparent agreement with the kappa statistic. Graders may agree less when exposed to the full spectrum of disease. Reliability error allows for the assessment of these borderline cases and can be used to refine an individual trainee's grading.
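The two measures named in the abstract can be illustrated concretely. The sketch below (not the study's code; grade lists and function names are hypothetical) computes Cohen's kappa for two binary graders and the reliability term of the standard Murphy decomposition of the Brier score, in which forecasts are grouped into bins of identical probability and the reliability term is the sample-weighted mean squared gap between each bin's forecast probability and its observed event frequency.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary grade lists (entries 0 or 1)."""
    n = len(a)
    # Observed proportion of agreement between the two graders.
    po = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement expected from each grader's marginal positive rate.
    pa1 = sum(a) / n
    pb1 = sum(b) / n
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (po - pe) / (1 - pe)

def brier_reliability(probs, outcomes):
    """Reliability term of the Murphy decomposition of the Brier score:
    forecasts sharing the same probability form a bin, and the term is the
    sample-weighted mean squared gap between each bin's forecast probability
    and the observed frequency of the outcome in that bin."""
    n = len(probs)
    bins = {}
    for p, o in zip(probs, outcomes):
        bins.setdefault(p, []).append(o)
    return sum(
        len(os) * (p - sum(os) / len(os)) ** 2 for p, os in bins.items()
    ) / n
```

A grader who systematically over-calls a sign (as the trainees did with TF) produces bins whose forecast probabilities sit above the observed frequencies, inflating the reliability term even when overall agreement looks acceptable.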

Details

Language :
English
ISSN :
1935-2727 and 1935-2735
Volume :
8
Issue :
5
Database :
Directory of Open Access Journals
Journal :
PLoS Neglected Tropical Diseases
Publication Type :
Academic Journal
Accession number :
edsdoj.13c0fc08525b473e8305d8ecb6d9fcd0
Document Type :
article
Full Text :
https://doi.org/10.1371/journal.pntd.0002840