Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation. NCEE 2017-4026
- Source :
- National Center for Education Evaluation and Regional Assistance. 2017.
- Publication Year :
- 2017
Abstract
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) across a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs to connect statistical methods to the building blocks of causal inference. They differ from the model-based methods that have commonly been used in education research, including hierarchical linear model (HLM) methods and robust cluster standard error (RCSE) methods for clustered designs. Compared with model-based methods, design-based methods tend to make fewer assumptions about the nature of the data and more explicitly account for known information about the experimental and sampling designs. While these theoretical differences suggest that the corresponding estimates might differ, it is unclear how much of a practical difference it makes to use design-based methods rather than more conventional model-based methods.

This study addresses that question by re-analyzing nine past RCTs in education using both design- and model-based methods. The study uses real data, rather than simulated data, to better explore the differences that would arise in practice. To investigate the full scope of differences between the methods, the study uses data generated from different types of randomization designs commonly used in social policy research: (1) non-clustered designs in which individuals are randomized; (2) clustered designs in which groups are randomized; (3) non-blocked designs in which randomization is conducted for a single population; and (4) blocked (stratified) designs in which randomization is conducted separately within partitions of the sample. The study conducts the design-based analyses using "RCT-YES," a free software package funded by the Institute of Education Sciences (IES) that applies design-based methods to a wide range of RCT designs (www.rct-yes.com).

This report focuses on two analyses that compare model- and design-based methods, both of which suggest there is little substantive difference between the results of the two approaches. For both analyses, the study uses a reference model-based method similar to the one used in the original evaluation. In the first analysis, the study compares the reference model-based method to a design-based method whose underlying assumptions most closely align with those of the reference method. In the second analysis, the report presents a sensitivity check that compares the reference model-based method to an alternative design-based method; in particular, the alternative method is based on the default settings in the "RCT-YES" software, which correspond to an alternative set of plausible assumptions. The findings from both analyses suggest that model- and design-based methods yield very similar results in terms of the magnitude of the impact estimates, their statistical significance, and their implications for policy.

To contextualize the differences in impact estimates between design- and model-based methods, the report also presents a third analysis, which compares estimates from two commonly used model-based methods: (1) HLM methods and (2) linear models with ordinary least squares (OLS) assumptions and RCSE to account for clustering. Importantly, this analysis suggests that the differences between the design- and model-based methods (with similar assumptions) are no greater than the differences that arise between commonly used model-based methods. The study suggests that researchers should select estimators with assumptions that best suit the goals of their study, regardless of whether they use a design- or model-based approach. Moreover, researchers should consider the trade-offs between different assumptions and how those assumptions affect the interpretation of findings. Appended are: (1) Hierarchical linear model methods; and (2) Detailed description of studies and results. [For related reports, see "What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025" (ED575014) and "Multi-Armed RCTs: A Design-Based Framework. NCEE 2017-4027" (ED575022).]
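To make the contrast concrete, the following is a minimal sketch, not taken from the report or from "RCT-YES," of the two estimation styles for the simplest case the study covers: a non-clustered, non-blocked RCT. It compares a design-based difference in means with the Neyman variance estimator against a model-based OLS regression of the outcome on a treatment indicator with robust standard errors. The simulated data and all variable names are hypothetical.

```python
# A minimal sketch (assumed example, not the report's code) contrasting a
# design-based and a model-based impact estimator for a simple RCT.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
t = rng.integers(0, 2, size=n)            # treatment indicator (randomized)
y = 0.2 * t + rng.normal(size=n)          # outcome with a true impact of 0.2

# Design-based: difference in means with the Neyman variance estimator,
# Var = s1^2/n1 + s0^2/n0, which conditions only on the randomization design.
y1, y0 = y[t == 1], y[t == 0]
impact_db = y1.mean() - y0.mean()
se_db = np.sqrt(y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0))

# Model-based: OLS of the outcome on a treatment dummy; the coefficient on t
# is the impact estimate, here with heteroskedasticity-robust (HC2) errors.
X = sm.add_constant(t)
fit = sm.OLS(y, X).fit(cov_type="HC2")
impact_mb, se_mb = fit.params[1], fit.bse[1]

print(f"design-based: impact {impact_db:.4f} (SE {se_db:.4f})")
print(f"model-based : impact {impact_mb:.4f} (SE {se_mb:.4f})")
```

In this simplest design the two approaches coincide almost exactly (with a single binary regressor, the HC2 robust variance reproduces the Neyman estimator); the report's substantive comparisons concern the clustered and blocked designs, where the underlying assumptions, and hence the estimates and standard errors, can diverge.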
Details
- Language :
- English
- Database :
- ERIC
- Journal :
- National Center for Education Evaluation and Regional Assistance
- Publication Type :
- Report
- Accession number :
- ED575021
- Document Type :
- Reports - Research; Numerical/Quantitative Data