Characterizing the Use of Images by State-Sponsored Troll Accounts on Twitter

State-sponsored organizations are increasingly linked to efforts to exploit social media for information warfare and for manipulating public opinion. Typically, their activities rely on a number of social network accounts they control, a.k.a. trolls, that post and interact with other users while disguised as "regular" users. These accounts often use images and memes, along with textual content, to increase the engagement and credibility of their posts. In this paper, we present the first study of images shared by state-sponsored accounts, analyzing a ground-truth dataset of 1.8M images posted to Twitter by accounts controlled by the Russian Internet Research Agency. First, we analyze the content of the images as well as their posting activity. Then, using Hawkes Processes, we quantify their influence on popular Web communities like Twitter, Reddit, 4chan's Politically Incorrect board (/pol/), and Gab with respect to the dissemination of images. We find that the extensive image-posting activity of Russian trolls coincides with real-world events (e.g., the Unite the Right rally in Charlottesville), and we shed light on their targets as well as the content disseminated via images. Finally, we show that the trolls were more effective in disseminating politics-related imagery than other images.
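To illustrate the kind of model the abstract refers to: a Hawkes process is a self-exciting point process whose intensity jumps after each event and then decays, which is why it is a natural fit for modeling how an image posted in one community triggers reposts elsewhere. The sketch below is a generic univariate simulation via Ogata's thinning algorithm with an exponential kernel; it is not the paper's (multivariate, fitted) model, and the parameter values `mu`, `alpha`, and `beta` are illustrative assumptions, not estimates from the dataset.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Simulate a univariate Hawkes process on [0, T] by Ogata's thinning.

    Intensity: lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)),
    where the t_i are past event times. With an exponential kernel the
    intensity only decreases between events, so its value at the current
    time is a valid upper bound for the thinning step.
    """
    rng = random.Random(seed)
    events = []
    t = 0.0
    while t < T:
        # Upper bound on the intensity until the next accepted event.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        # Propose the next candidate time from a homogeneous process
        # with rate lam_bar.
        t += rng.expovariate(lam_bar)
        if t >= T:
            break
        # Accept the candidate with probability lambda(t) / lam_bar.
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)
    return events

# Illustrative parameters: baseline rate 0.5, branching ratio
# alpha / beta = 0.8 / 1.5 < 1, so the process is stable.
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, T=100.0, seed=42)
```

In the multivariate setting used for this kind of influence estimation, each community gets its own dimension, and the fitted cross-excitation weights quantify how strongly an event (image appearance) on one platform raises the event rate on another.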
