Transparent reporting items for simulation studies evaluating statistical methods: Foundations for reproducibility and reliability
- Authors
Coralie Williams, Yefeng Yang, Malgorzata Lagisz, Kyle Morrison, Lorenzo Ricolfi, David I. Warton, and Shinichi Nakagawa
- Subjects
meta-research, Monte Carlo simulation, replicability, reporting checklist, reporting quality, reproducibility, Ecology, QH540-549.5, Evolution, QH359-425
- Abstract
Simulation studies are essential tools for assessing statistical methods. Functioning as controlled experiments, simulations generate data from known underlying processes. However, unclear or incomplete reporting of simulation studies can impair their interpretability and reproducibility, potentially leading to the misuse of statistical methods. While Morris et al. (2019, Stat Med, 38, p. 2074) recently provided guidance on the planning and conduct of simulation studies for statistical method evaluation, there is currently no comprehensive set of reporting guidelines in ecology and evolutionary biology. Here, we propose 11 reporting items for statistical simulation studies, extending Morris and colleagues' guidance. These items span three stages: planning, coding and analysis. We also clarify the terminology related to statistical components and the broad purposes of statistical simulation studies. To compare our proposed reporting items against current practices, we surveyed 100 articles in ecology and evolution journals that included a simulation study evaluating a statistical method. Our survey found room for improvement in more transparent reporting to ensure clear evaluation of statistical methods. Most notably, only a small proportion of articles reported a measure of Monte Carlo uncertainty (17%; 17 out of 98), and 32% (32 out of 100) of articles did not provide code. Beyond the proposed reporting items, we discuss the benefits of open science tools for enhancing the reproducibility of simulation studies. Specifically, we propose the registration of statistical simulation studies to enhance planning, reporting and collaboration. We aim to instigate discussions to improve the reporting of simulation studies for statistical method research. The reporting items we propose, along with open science tools, serve as a template for developing standards and guidelines for simulation studies evaluating statistical methods.
These reporting guidelines will help enhance reproducibility and, indirectly, encourage more careful design and conduct of simulation studies.
- Published
- 2024