An Agenda for Open Science in Communication

Over the past ten years, many canonical findings in the social sciences have proven unreliable. This so-called “replication crisis” has spurred calls for open science practices, which aim to increase the reproducibility, replicability, and generalizability of findings. Communication research is subject to many of the same challenges that have caused low replicability in other fields. We therefore propose an agenda for adopting open science practices in Communication, comprising seven suggestions: (1) publish materials, data, and code; (2) preregister studies and submit registered reports; (3) conduct replications; (4) collaborate; (5) foster open science skills; (6) implement the Transparency and Openness Promotion Guidelines; and (7) incentivize open science practices. Although our agenda focuses mostly on quantitative research, we also reflect on open science practices relevant to qualitative research. We conclude by discussing potential objections and concerns associated with open science practices.
