Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible

The social sciences, including economics, have long called for transparency in research to counter threats to producing robust and replicable results. In this paper, we discuss the pros and cons of three of the more prominent proposed approaches: pre-analysis plans, hypothesis registries, and replications. Because these approaches have been discussed primarily for experimental research, both in the field (including randomized controlled trials) and in the laboratory, we focus on those areas. A pre-analysis plan is a credibly fixed plan, submitted before a project begins, of how a researcher will collect and analyze data. Though pre-analysis plans have been lauded in the popular press and across the social sciences, we argue that enthusiasm for them should be tempered for several reasons. A hypothesis registry is a database of all projects attempted; the goal of this promising mechanism is to alleviate the "file drawer problem": statistically significant results are more likely to be published, while other results are consigned to the researcher's "file drawer." Finally, we evaluate the efficacy of replications. We argue that even with modest amounts of researcher bias, whether replication attempts bent on proving or disproving the published work or simply poor replication attempts, replications correct even the most inaccurate beliefs within three to five replications. We offer practical proposals for how to increase researchers' incentives to carry out replications.