Transparency, Replication, and Cumulative Learning: What Experiments Alone Cannot Achieve

Replication of simple and transparent experiments should promote the accumulation of knowledge. Yet randomization alone does not guarantee simple analysis, transparent reporting, or third-party replication. This article surveys several challenges to cumulative learning from experiments and discusses emerging research practices (including several kinds of prespecification, two forms of replication, and a new model for coordinated experimental research) that may partially overcome these obstacles. I reflect on both the strengths and limitations of these new approaches to doing social science research. NOTE: This article has been corrected and republished as doi:10.1146/annurev-polisci-072516-014127.
