Randomly Auditing Research Labs Could Be an Affordable Way to Improve Research Quality: A Simulation Study
Overview
The "publish or perish" incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors, which is a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world in which labs that produce many papers are more likely to have "child" labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral, we added random audits that could detect and remove labs with a high proportion of false positives; audits also improved the behaviour of "child" and "parent" labs, which increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations did not experience the competitive spiral, defined as convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers avoided it in 95% of simulations. Audits worked best when they were applied only to established labs with 50 or more papers, rather than to labs with 25 or more papers. Adding a ±20% random error to the number of false positives, to simulate peer reviewer error, did not reduce the audits' efficacy. The main benefit of the audits came from the increase in effort in "child" and "parent" labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced with National Institutes of Health funding. Our simulation greatly simplifies the research world, and there are many unanswered questions about whether and how audits would work that can only be addressed by a trial of an audit.
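The dynamics described above — inheritance of lab characteristics, selection on paper output, and random audits that remove high-false-positive labs while prompting others to raise their effort — can be illustrated with a toy model. The sketch below is not the authors' published simulation; its structure, parameter values, and audit threshold are all invented for illustration.

```python
import random

def run_sim(n_labs=50, steps=200, audit_rate=0.0, mutation=0.05, seed=1):
    """Toy evolutionary model of the competitive spiral.

    Each lab has a false positive probability `fp`. Labs with higher fp
    publish more papers (quantity bought by lowered quality), so selection
    on output alone pushes fp upward. A random audit flags one lab; if its
    fp exceeds a threshold the lab is removed and replaced, and the
    remaining labs respond by increasing effort (lowering their fp).
    All numbers here are illustrative assumptions, not the paper's values.
    """
    rng = random.Random(seed)
    labs = [rng.uniform(0.05, 0.2) for _ in range(n_labs)]  # fp per lab
    for _ in range(steps):
        # Paper output grows with fp: less effort per paper, more papers.
        outputs = [0.1 + fp for fp in labs]
        # Random audit of one lab's record.
        if rng.random() < audit_rate:
            i = rng.randrange(n_labs)
            if labs[i] > 0.3:  # hypothetical removal threshold
                labs[i] = rng.uniform(0.05, 0.2)  # replaced by a new lab
                # Other labs raise effort in response to the audit.
                labs = [max(0.01, fp * 0.9) for fp in labs]
        # Selection: the most productive lab spawns a "child" lab that
        # replaces the least productive one, inheriting fp with mutation.
        best = max(range(n_labs), key=outputs.__getitem__)
        worst = min(range(n_labs), key=outputs.__getitem__)
        child = labs[best] + rng.gauss(0, mutation)
        labs[worst] = min(1.0, max(0.0, child))
    return sum(labs) / n_labs  # mean false positive probability

no_audit = run_sim(audit_rate=0.0)
with_audit = run_sim(audit_rate=0.5)
```

In this toy version the no-audit run drifts toward high false positive probabilities because selection sees only output, mirroring the competitive spiral the paper describes; the audit branch counteracts the drift both by removing flagged labs and through the effort increase in the surviving labs, which the abstract identifies as the audits' main benefit.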
Fitzpatrick B, Gorman D, Trombatore C PLoS One. 2024; 19(5):e0303262.
PMID: 38753677 PMC: 11098386. DOI: 10.1371/journal.pone.0303262.
Shifting the Level of Selection in Science.
Tiokhin L, Panchanathan K, Smaldino P, Lakens D Perspect Psychol Sci. 2023; 19(6):908-920.
PMID: 37526118 PMC: 11539478. DOI: 10.1177/17456916231182568.
Replication of the natural selection of bad science.
Kohrt F, Smaldino P, McElreath R, Schonbrodt F R Soc Open Sci. 2023; 10(2):221306.
PMID: 36844805 PMC: 9943874. DOI: 10.1098/rsos.221306.
Improving quality of preclinical academic research through auditing: A feasibility study.
Kurreck C, Castanos-Velez E, Freyer D, Blumenau S, Przesdzing I, Bernard R PLoS One. 2020; 15(10):e0240719.
PMID: 33057427 PMC: 7561085. DOI: 10.1371/journal.pone.0240719.
Open science and modified funding lotteries can impede the natural selection of bad science.
Smaldino P, Turner M, Contreras Kallens P R Soc Open Sci. 2019; 6(8):191249.
PMID: 31543978 PMC: 6731693. DOI: 10.1098/rsos.191249.