Meta-assessment of bias in science

Significance

Science is said to be suffering a reproducibility crisis caused by many biases. How common are these problems across the wide diversity of research fields? We probed for multiple bias-related patterns in a large random sample of meta-analyses taken from all disciplines. The magnitude of these biases varied widely across fields and was on average relatively small. However, we consistently observed that small, early, highly cited studies published in peer-reviewed journals were likely to overestimate effects. We found little evidence that these biases were related to scientific productivity, and we found no difference between biases in male and female researchers. However, a scientist's early-career status, isolation, and lack of scientific integrity might be significant risk factors for producing unreliable results.

Numerous biases are believed to affect the scientific literature, but their actual prevalence across disciplines is unknown. To gain a comprehensive picture of the potential imprint of bias in science, we probed for the most commonly postulated bias-related patterns and risk factors in a large random sample of meta-analyses taken from all disciplines. The magnitude of these biases varied widely across fields and was overall relatively small. However, we consistently observed that small, early, and highly cited studies were at significant risk of overestimating effects, and that studies not published in peer-reviewed journals were at risk of underestimating them. We also found at least partial confirmation of previous evidence suggesting that US studies and early studies might report more extreme effects, although these effects were smaller and more heterogeneously distributed across meta-analyses and disciplines. Authors publishing at high rates and receiving many citations were, overall, not at greater risk of bias. However, effect sizes were likely to be overestimated by early-career researchers, those working in small or long-distance collaborations, and those responsible for scientific misconduct, supporting hypotheses that connect bias to situational factors, lack of mutual control, and individual integrity. Some of these patterns and risk factors might have modestly increased in intensity over time, particularly in the social sciences. Our findings suggest that, beyond routine caution that small, highly cited, and early published studies may yield inflated results, the feasibility and costs of interventions to attenuate biases in the literature may need to be discussed on a discipline- and topic-specific basis.
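The small-study effect described above (smaller, less precise studies reporting larger effects) is typically probed with an Egger-style regression of effect sizes on their standard errors within a meta-analysis. The sketch below is a minimal illustration of that general technique, assuming standard numpy and statsmodels; the function name and synthetic data are hypothetical and not taken from the paper, whose actual analysis pooled thousands of meta-analyses with multilevel (mixed-effects) regressions.

```python
# Hypothetical sketch of an Egger-style small-study-effect test for a single
# meta-analysis. Not the paper's code; for illustration only.
import numpy as np
import statsmodels.api as sm

def egger_small_study_test(effects, std_errors):
    """Regress effect sizes on their standard errors, weighting by precision.

    A positive, significant slope suggests that smaller (less precise) studies
    report larger effects, i.e. a small-study effect.
    Returns the slope estimate and its two-sided p-value.
    """
    effects = np.asarray(effects, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    X = sm.add_constant(std_errors)                      # intercept + SE as predictor
    fit = sm.WLS(effects, X, weights=1.0 / std_errors**2).fit()
    return fit.params[1], fit.pvalues[1]

if __name__ == "__main__":
    # Synthetic example: true effect 0.2, exaggerated in imprecise studies.
    rng = np.random.default_rng(0)
    se = rng.uniform(0.05, 0.5, size=30)
    eff = 0.2 + 0.8 * se + rng.normal(0.0, se)
    slope, p = egger_small_study_test(eff, se)
    print(f"small-study slope = {slope:.2f}, p = {p:.3f}")
```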
