
Validity and Reproducibility of Immunohistochemical Scoring by Trained Non-Pathologists on Tissue Microarrays.

Authors :
Jenniskens JCA
Offermans K
Samarska I
Fazzi GE
Simons CCJM
Smits KM
Schouten LJ
Weijenberg MP
van den Brandt PA
Grabsch HI
Source :
Cancer epidemiology, biomarkers & prevention : a publication of the American Association for Cancer Research, cosponsored by the American Society of Preventive Oncology [Cancer Epidemiol Biomarkers Prev] 2021 Oct; Vol. 30 (10), pp. 1867-1874. Date of Electronic Publication: 2021 Jul 16.
Publication Year :
2021

Abstract

Background: Scoring of immunohistochemistry (IHC) staining is often done by non-pathologists, especially in large-scale tissue microarray (TMA)-based studies. Studies on the validity and reproducibility of scoring results from non-pathologists are limited. Therefore, our main aim was to assess interobserver agreement between trained non-pathologists and an experienced histopathologist for three IHC markers with different subcellular localization (nucleus/membrane/cytoplasm).

Methods: Three non-pathologists were trained in recognizing adenocarcinoma and in IHC scoring by a senior histopathologist. Kappa statistics were used to analyze interobserver and intraobserver agreement for 6,249 TMA cores from a colorectal cancer series.

Results: Interobserver agreement between non-pathologists (independently scored) and the histopathologist was "substantial" for the nuclear and membranous IHC markers (κ range = 0.67-0.75 and κ range = 0.61-0.69, respectively), and "moderate" for the cytoplasmic IHC marker (κ range = 0.43-0.57). Scores of the three non-pathologists were also combined into a "combination score" (if at least two non-pathologists independently assigned the same score to a core, this was the combination score). This increased agreement with the pathologist (κ nuclear = 0.74; κ membranous = 0.73; κ cytoplasmic = 0.57). Interobserver agreement between non-pathologists was "substantial" (κ nuclear = 0.78; κ membranous = 0.72; κ cytoplasmic = 0.61). Intraobserver agreement of non-pathologists was "substantial" to "almost perfect" (κ nuclear, range = 0.83-0.87; κ membranous, range = 0.75-0.82; κ cytoplasmic = 0.69). Overall, agreement was lowest for the cytoplasmic IHC marker.

Conclusions: This study shows that adequately trained non-pathologists are able to generate reproducible IHC scoring results that are similar to those of an experienced histopathologist. A combination score of at least two non-pathologists yielded optimal results.

Impact: Non-pathologists can generate reproducible IHC results after appropriate training, making analyses of large-scale molecular pathological epidemiology studies feasible within an acceptable time frame.

(©2021 American Association for Cancer Research.)
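Note: the abstract reports kappa statistics and a majority-based "combination score" (the score independently assigned by at least two of the three non-pathologists). The sketch below is a minimal illustrative reconstruction, not the authors' code; it assumes unweighted Cohen's kappa, a simple three-category score per TMA core, and that cores without a two-rater majority are excluded. The paper's exact kappa variant, scoring categories, and tie handling are not specified here.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa between two raters' categorical scores."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of cores where both raters gave the same score.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal score frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

def combination_score(s1, s2, s3):
    """Score assigned by at least two of the three raters; None if all three differ."""
    score, votes = Counter([s1, s2, s3]).most_common(1)[0]
    return score if votes >= 2 else None

# Hypothetical per-core IHC scores (e.g., 0 = negative, 1 = weak, 2 = strong).
pathologist = [2, 1, 0, 2, 1]
observer_1  = [2, 1, 0, 1, 1]
observer_2  = [2, 1, 1, 2, 1]
observer_3  = [2, 0, 0, 2, 1]

combined = [combination_score(a, b, c)
            for a, b, c in zip(observer_1, observer_2, observer_3)]
# Drop cores where no two non-pathologists agreed (handling in the paper not specified).
paired = [(p, c) for p, c in zip(pathologist, combined) if c is not None]
print(cohen_kappa(pathologist, observer_1))                         # single observer vs. pathologist
print(cohen_kappa([p for p, _ in paired], [c for _, c in paired]))  # combination score vs. pathologist
```

For ordinal IHC scores, a weighted kappa (e.g., scikit-learn's cohen_kappa_score with its weights option) could be substituted for the unweighted version shown above.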

Details

Language :
English
ISSN :
1538-7755
Volume :
30
Issue :
10
Database :
MEDLINE
Journal :
Cancer epidemiology, biomarkers & prevention : a publication of the American Association for Cancer Research, cosponsored by the American Society of Preventive Oncology
Publication Type :
Academic Journal
Accession number :
34272264
Full Text :
https://doi.org/10.1158/1055-9965.EPI-21-0295