How to Open Science: Debugging Reproducibility within the Educational Data Mining Conference
- Author
Haim, Aaron; Gyurcsan, Robert; Baxter, Chris; Shaw, Stacy T.; and Heffernan, Neil T.
- Abstract
Despite increased efforts to assess the adoption of open science practices and the robustness of reproducibility in sub-disciplines of education technology, there is little understanding of why some research is not reproducible. Prior work has taken the first step toward assessing the reproducibility of research, but has imposed constraints that obscure why reproduction attempts fail. The purpose of this study was therefore to replicate previous work on papers within the proceedings of the "International Conference on Educational Data Mining" and to report accurately on which papers are reproducible and why. Specifically, we examined 208 papers, attempted to reproduce each one, documented the reasons reproduction failed, and asked authors to provide the additional information needed to reproduce their studies. Our results showed that of the 12 papers that were potentially reproducible, only one had all of its analyses reproduced successfully, and another two had most of their analyses reproduced. The most common cause of failure was omission of the libraries needed to run the code, followed by unseeded randomness. [For the complete proceedings, see ED630829. Additional funding for this paper was provided by the U.S. Department of Education's Graduate Assistance in Areas of National Need (GAANN).]
- Published
2023
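The two failure modes named in the abstract, undeclared library dependencies and unseeded randomness, both have routine mitigations. Below is a minimal Python sketch of those mitigations; it is illustrative only and not taken from the paper, and the NumPy dependency and seed value 42 are arbitrary stand-ins.

```python
# Illustrative sketch: avoiding the two most common reproducibility
# failures reported above. NumPy and the seed value are assumptions,
# not details from the paper under review.
import random
import sys

import numpy as np  # stand-in dependency; its version should be pinned


SEED = 42  # a fixed, documented seed makes stochastic analyses repeatable


def set_seeds(seed: int = SEED) -> None:
    """Seed every random number generator the analysis touches."""
    random.seed(seed)
    np.random.seed(seed)


def log_environment() -> None:
    """Record interpreter and library versions so a reader can rebuild
    the environment instead of guessing at unstated dependencies."""
    print(f"python {sys.version.split()[0]}")
    print(f"numpy {np.__version__}")


if __name__ == "__main__":
    set_seeds()
    log_environment()
    # Every run after seeding draws the same random sequence, so any
    # reported numbers can be regenerated exactly.
    print(np.random.normal(size=3))
```

Declaring the dependency list (for example, in a pinned requirements.txt) alongside a seeded entry point like this addresses both failure causes the study found most often.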