The Interconnected Nature of Online Harm and Moderation: Investigating the Cross-Platform Spread of Harmful Content between YouTube and Twitter

The proliferation of harmful content shared online threatens the integrity of online information and discussion across platforms. Despite the moderation interventions adopted by individual social media platforms, researchers and policymakers are calling for holistic solutions. This study explores how a target platform could leverage content that has been deemed harmful on a source platform by investigating the behavior and characteristics of Twitter users who share moderated YouTube videos. Using a large-scale dataset of 600M tweets related to the 2020 U.S. election, we find that moderated YouTube videos are extensively shared on Twitter and that the users who share them also endorse extreme and conspiratorial ideologies. A fraction of these users are eventually suspended by Twitter, but they do not appear to be involved in state-backed information operations. These findings highlight the complex, interconnected nature of harmful cross-platform information diffusion and underscore the need for cross-platform moderation strategies.

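To make the kind of pipeline this analysis implies concrete, the sketch below extracts YouTube video IDs from tweet text and probes each video's availability through YouTube's public oEmbed endpoint, where a non-200 response typically indicates a removed, private, or otherwise unavailable video. This is a minimal sketch, not the study's actual methodology: the regular expression, the helper names (`extract_video_ids`, `is_video_unavailable`), and the sample tweet are illustrative assumptions, and a non-200 status alone cannot distinguish platform moderation from voluntary deletion by the uploader.

```python
import re
import requests

# Matches the two most common YouTube URL forms: watch links and youtu.be
# short links. Video IDs are 11 characters drawn from [A-Za-z0-9_-].
YOUTUBE_ID_RE = re.compile(
    r"(?:youtube\.com/watch\?(?:[^ ]*&)?v=|youtu\.be/)([\w-]{11})"
)

def extract_video_ids(tweet_text: str) -> list[str]:
    """Return the YouTube video IDs found in a tweet's text."""
    return YOUTUBE_ID_RE.findall(tweet_text)

def is_video_unavailable(video_id: str) -> bool:
    """Probe YouTube's public oEmbed endpoint.

    The endpoint returns 200 with metadata for a public video; removed or
    private videos yield a non-200 status (commonly 404 or 401/403).
    """
    resp = requests.get(
        "https://www.youtube.com/oembed",
        params={
            "url": f"https://www.youtube.com/watch?v={video_id}",
            "format": "json",
        },
        timeout=10,
    )
    return resp.status_code != 200

# Hypothetical tweet used purely for illustration.
tweet = "Watch this before it gets taken down https://youtu.be/dQw4w9WgXcQ"
for vid in extract_video_ids(tweet):
    print(vid, "unavailable" if is_video_unavailable(vid) else "still up")
```

At the scale of a 600M-tweet corpus, one would presumably batch-resolve shortened URLs, query video status through the YouTube Data API rather than per-video HTTP probes, and recheck statuses over time, since videos are moderated gradually after being shared.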