
Quantitative bias analysis for external control arms using real-world data in clinical trials: a primer for clinical researchers.

Authors :
Thorlund K
Duffield S
Popat S
Ramagopalan S
Gupta A
Hsu G
Arora P
Subbiah V
Source :
Journal of comparative effectiveness research [J Comp Eff Res] 2024 Mar; Vol. 13 (3), pp. e230147. Date of Electronic Publication: 2024 Jan 11.
Publication Year :
2024

Abstract

Development of medicines in rare oncologic patient populations is growing, but well-powered randomized controlled trials are typically extremely challenging or unethical to conduct in such settings. External control arms using real-world data are increasingly used to supplement clinical trial evidence where little or no control arm data exist. The construction of an external control arm should always aim to match the population, treatment settings and outcome measurements of the corresponding treatment arm. Yet, external real-world data are typically fraught with limitations, including missing data, measurement error and the potential for unmeasured confounding given a nonrandomized comparison. Quantitative bias analysis (QBA) comprises a collection of approaches for modelling the magnitude of systematic errors in data that cannot be addressed with conventional statistical adjustment. Its applications range from simple deterministic equations to complex hierarchical models. QBA applied to external control arms represents an opportunity to evaluate the validity of the corresponding comparative efficacy estimates. We provide a brief overview of available QBA approaches and explore their application in practice. Using a motivating example comparing pralsetinib single-arm trial data with real-world data for pembrolizumab alone or combined with chemotherapy in patients with RET fusion-positive advanced non-small cell lung cancer (aNSCLC; 1-2% of all NSCLC), we illustrate how QBA can be applied to external control arms. We show how QBA is used to ascertain the robustness of results despite a large proportion of missing data on baseline ECOG performance status and suspected unmeasured confounding. Robustness is demonstrated by showing that no meaningful change to the comparative effect was observed across several 'tipping-point' scenario analyses, and that the suspected unmeasured confounding could be ruled out using E-values. Full R code is also provided.
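The E-values mentioned in the abstract quantify how strong an unmeasured confounder would have to be, on the risk-ratio scale, to fully explain away an observed comparative effect. Below is a minimal sketch of the standard E-value formula (VanderWeele & Ding) in Python; it is not the authors' R code, and the hazard ratio of 0.50 is a hypothetical value chosen purely for illustration:

import math

def e_value(rr):
    # E-value: the minimum strength of association an unmeasured confounder
    # would need with both treatment and outcome to fully explain away an
    # observed ratio rr (VanderWeele & Ding, 2017).
    rr = 1 / rr if rr < 1 else rr  # invert protective effects first
    return rr + math.sqrt(rr * (rr - 1))

print(round(e_value(0.50), 2))  # hypothetical hazard ratio 0.50 -> E-value ~3.41

An E-value larger than any plausible confounder-treatment and confounder-outcome association supports the abstract's conclusion that suspected unmeasured confounding can be ruled out; the paper's own analysis is implemented in the R code it provides.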

Details

Language :
English
ISSN :
2042-6313
Volume :
13
Issue :
3
Database :
MEDLINE
Journal :
Journal of comparative effectiveness research
Publication Type :
Academic Journal
Accession number :
38205741
Full Text :
https://doi.org/10.57264/cer-2023-0147