Susceptibility to misinformation is consistent across question framings and response modes and better explained by myside bias and partisanship than analytical thinking

Misinformation presents a significant societal problem. To measure individuals’ susceptibility to misinformation and study its predictors, researchers have used a broad variety of ad hoc item sets, scales, question framings, and response modes. Because of this variety, it remains unknown whether results from different studies can be compared (e.g., in meta-analyses). In this preregistered study (US sample; N = 2,622), we compare five commonly used question framings (eliciting perceived headline accuracy, manipulativeness, reliability, trustworthiness, and whether a headline is real or fake) and three response modes (binary, 6-point, and 7-point scales), using the psychometrically validated Misinformation Susceptibility Test (MIST). We test (1) whether different question framings and response modes yield similar responses for the same item set, (2) whether people’s confidence in their primary judgments is affected by question framings and response modes, and (3) which key psychological factors (myside bias, political partisanship, cognitive reflection, and numeracy skills) best predict misinformation susceptibility across assessment methods. Different response modes and question framings yield similar (but not identical) responses for both primary ratings and confidence judgments. We also find a similar nomological net across conditions, suggesting cross-study comparability. Finally, myside bias and political conservatism are strongly positively correlated with misinformation susceptibility, whereas numeracy skills and especially cognitive reflection are less important (although we note potential ceiling effects for numeracy). We thus find more support for an “integrative” account than a “classical reasoning” account of misinformation belief.
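
To make the comparison across response modes concrete, the sketch below (not the authors’ analysis code) shows one way binary real/fake judgments and 6- or 7-point ratings could be mapped onto a common 0–1 scale and combined into a MIST-style veracity discernment score, which can then be related to predictors such as myside bias within each framing condition. The rescaling scheme, column layout, and simulated data are illustrative assumptions only.

```python
# Minimal sketch, assuming simulated data: put different response modes on a
# common 0-1 scale and compute a MIST-style discernment score per respondent.
import numpy as np

rng = np.random.default_rng(42)

def rescale(responses, n_points):
    """Map 1..n_points ratings (or 0/1 binary judgments) onto [0, 1]."""
    responses = np.asarray(responses, dtype=float)
    if n_points == 2:                     # binary real/fake coded 0/1
        return responses
    return (responses - 1) / (n_points - 1)

def discernment(perceived_accuracy, is_real):
    """Veracity discernment: mean rating of real items minus mean rating
    of fake items (higher = better discernment)."""
    perceived_accuracy = np.asarray(perceived_accuracy, dtype=float)
    is_real = np.asarray(is_real, dtype=bool)
    return perceived_accuracy[is_real].mean() - perceived_accuracy[~is_real].mean()

# One hypothetical respondent rating 20 MIST-like items (10 real, 10 fake)
is_real = np.repeat([True, False], 10)
ratings_7pt = rng.integers(1, 8, size=20)   # 7-point accuracy ratings
ratings_bin = rng.integers(0, 2, size=20)   # binary real/fake judgments

print(discernment(rescale(ratings_7pt, 7), is_real))
print(discernment(rescale(ratings_bin, 2), is_real))

# Across respondents, susceptibility scores derived this way can be correlated
# with predictors (e.g., myside bias, partisanship, cognitive reflection)
# separately within each framing/response-mode condition to compare the
# "nomological net" across assessment methods.
```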
