OBELS, Pepijn, Daniël LAKENS, Nicholas COLES, Jaroslav GOTTFRIED and Seth GREEN. Analysis of Open Data and Computational Reproducibility in Registered Reports in Psychology. Advances in Methods and Practices in Psychological Science. 2020, vol. 3, no. 2, pp. 229-237, 8 pp. ISSN 2515-2459. Available from: https://dx.doi.org/10.1177/2515245920918872.
Basic information
Original name Analysis of Open Data and Computational Reproducibility in Registered Reports in Psychology
Authors OBELS, Pepijn, Daniël LAKENS, Nicholas COLES, Jaroslav GOTTFRIED and Seth GREEN.
Edition Advances in Methods and Practices in Psychological Science, 2020, ISSN 2515-2459.
Other information
Type of outcome Article in a journal
Confidentiality degree Not subject to a state or trade secret
DOI http://dx.doi.org/10.1177/2515245920918872
Keywords in English reproducibility, Registered Reports, data sharing, open science, open data, open materials
Tags Reviewed
Changed by Mgr. Jaroslav Gottfried, učo 393127. Changed: 1/9/2020 14:36.
Abstract
Ongoing technological developments have made it easier than ever before for scientists to share their data, materials, and analysis code. Sharing data and analysis code makes it easier for other researchers to reuse or check published research. However, these benefits will emerge only if researchers can reproduce the analyses reported in published articles and if data are annotated well enough so that it is clear what all variable and value labels mean. Because most researchers are not trained in computational reproducibility, it is important to evaluate current practices to identify those that can be improved. We examined data and code sharing for Registered Reports published in the psychological literature from 2014 to 2018 and attempted to independently computationally reproduce the main results in each article. Of the 62 articles that met our inclusion criteria, 41 had data available, and 37 had analysis scripts available. Both data and code for 36 of the articles were shared. We could run the scripts for 31 analyses, and we reproduced the main results for 21 articles. Although the percentage of articles for which both data and code were shared (36 out of 62, or 58%) and the percentage of articles for which main results could be computationally reproduced (21 out of 36, or 58%) were relatively high compared with the percentages found in other studies, there is clear room for improvement. We provide practical recommendations based on our observations and cite examples of good research practices in the studies whose main results we reproduced.