Did State-sponsored Trolls Shape the US Presidential Election Discourse? Quantifying Influence on Twitter

It is widely accepted that state-sponsored Twitter accounts operated during the 2016 US presidential election, spreading millions of tweets with misinformation and inflammatory political content. Whether these social media campaigns by the so-called "troll" accounts were able to manipulate public opinion is still an open question. Here, we aim to quantify the influence of troll accounts on Twitter by analyzing 152.5 million tweets from 9.9 million users, including 822 troll accounts. The data, collected during the US election campaign, contain original troll tweets. From these data, we constructed a very large interaction graph: a directed graph of 9.3 million nodes and 169.9 million edges. Recently, Twitter released datasets on the misinformation campaigns of 8,275 state-sponsored accounts linked to Russia, Iran and Venezuela. These data serve as a ground-truth identifier of troll users in our dataset. Using graph analysis techniques along with a game-theoretic centrality measure, we quantify the influence of all Twitter accounts (authentic users and trolls) on the overall information exchange, as defined by the retweet cascades. We then provide a global influence ranking of all Twitter accounts and find that only four troll accounts appear in the top-1000 and only one in the top-100. This finding, together with other evidence, suggests that authentic users were the driving force of virality and influence in the network.
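The abstract does not specify the exact game-theoretic centrality used; as a minimal sketch of the general approach, the code below estimates a Shapley-value-based centrality on a toy directed retweet graph via Monte Carlo permutation sampling. The "coverage" coalition value, the toy graph, the node names, and the edge direction are all illustrative assumptions, not the paper's method.

```python
# Sketch (assumptions noted above): Shapley-value centrality via permutation sampling.
import random
import networkx as nx

def shapley_centrality(G, num_permutations=200, seed=0):
    """Monte Carlo estimate of each node's Shapley value for a simple 'coverage' game:
    the value of a coalition is the number of nodes it contains or points to."""
    rng = random.Random(seed)
    nodes = list(G.nodes())
    shapley = {v: 0.0 for v in nodes}
    for _ in range(num_permutations):
        rng.shuffle(nodes)
        covered = set()
        for v in nodes:
            # Marginal contribution of v: nodes newly covered when v joins the coalition.
            gain = len(({v} | set(G.successors(v))) - covered)
            shapley[v] += gain
            covered.add(v)
            covered.update(G.successors(v))
    return {v: s / num_permutations for v, s in shapley.items()}

# Toy retweet graph: an edge u -> v meaning "u was retweeted by v" (direction is an assumption).
G = nx.DiGraph([("troll_1", "user_a"), ("user_a", "user_b"),
                ("user_a", "user_c"), ("user_b", "user_d")])
ranking = sorted(shapley_centrality(G).items(), key=lambda kv: -kv[1])
print(ranking)  # higher estimated Shapley value ~ more influential under this toy game
```

Averaging marginal contributions over random permutations converges to the exact Shapley value; on graphs with millions of nodes, closed-form or sampling-efficient variants of such games are typically used instead of naive enumeration.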
