"Everything I Disagree With is #FakeNews": Correlating Political Polarization and Spread of Misinformation

An important challenge in tracking and detecting the dissemination of misinformation is understanding the political gap between people who engage with so-called "fake news". A possible factor behind this gap is opinion polarization, which may prompt the general public to classify content that they disagree with, or want to discredit, as fake. In this work, we study the relationship between political polarization and content reported by Twitter users as related to "fake news". We investigate how polarization may create distinct narratives about what misinformation actually is. We base our study on two datasets collected from Twitter. The first dataset contains tweets about US politics in general, from which we compute each user's degree of polarization towards the Republican and Democratic parties. For the second dataset, we collect tweets and URLs that co-occurred with "fake news"-related keywords and hashtags, such as #FakeNews and #AlternativeFact, as well as reactions to those tweets and URLs. We then analyze the relationship between polarization and what is perceived as misinformation, and whether users are labeling information that they disagree with as fake. Our results show an increase in the polarization of users and URLs associated with fake-news keywords and hashtags, compared to information not labeled as "fake news". We discuss the impact of our findings on the challenges of tracking "fake news" in the ongoing battle against misinformation.
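The per-user "degree of polarization" mentioned above can be illustrated with a minimal sketch: a normalized leaning score over a user's party-aligned interactions (e.g. retweets of known partisan accounts). This is a hypothetical illustration only; the function name, the inputs, and the formula are assumptions, not the paper's actual method.

```python
def polarization_score(rep_interactions: int, dem_interactions: int) -> float:
    """Hypothetical leaning score in [-1, 1].

    -1 means interactions are entirely Democrat-aligned,
    +1 entirely Republican-aligned, 0 balanced or no data.
    (Illustrative only; not the formula used in the paper.)
    """
    total = rep_interactions + dem_interactions
    if total == 0:
        return 0.0  # no partisan interactions observed
    return (rep_interactions - dem_interactions) / total

# A user with 30 Republican-aligned and 10 Democrat-aligned retweets
# gets a score of (30 - 10) / 40 = 0.5.
print(polarization_score(30, 10))
```

Under a scheme like this, the abstract's finding would correspond to users and URLs tied to fake-news hashtags having score distributions shifted toward the extremes (near -1 or +1) relative to the general-politics baseline.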
