A Longitudinal Analysis of YouTube's Promotion of Conspiracy Videos

Conspiracy theories have flourished on social media, raising concerns that such content is fueling the spread of disinformation, supporting extremist ideologies, and in some cases, leading to violence. Under increased scrutiny and pressure from legislators and the public, YouTube announced efforts to change its recommendation algorithm so that the most egregious conspiracy videos are demoted and demonetized. To verify this claim, we developed a classifier for automatically determining whether a video is conspiratorial (e.g., the moon landing was faked, the pyramids of Giza were built by aliens, end-of-the-world prophecies, etc.). We coupled this classifier with an emulation of YouTube's watch-next algorithm over more than a thousand popular informational channels to obtain a year-long picture of the videos actively promoted by YouTube. We also measured trends in the so-called filter-bubble effect for conspiratorial recommendations.
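
As a rough illustration of what such a classifier might look like, the sketch below trains a simple TF-IDF plus logistic-regression model on video text (titles, descriptions, captions). The training snippets, feature choices, and model are placeholders for illustration only, not the paper's actual pipeline or data.

```python
# Minimal sketch, assuming a text-based classifier over video metadata.
# The labeled snippets below are hypothetical; the real system would be
# trained on a much larger corpus of annotated videos.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = conspiratorial, 0 = not.
snippets = [
    "proof the moon landing was staged in a studio",
    "nasa engineers explain the apollo guidance computer",
    "the pyramids of giza were built by ancient aliens",
    "archaeologists document how the giza pyramids were constructed",
]
labels = [1, 0, 1, 0]

# TF-IDF features + logistic regression as a simple stand-in classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(snippets, labels)

# Score a new video's text; the decision threshold is left to the analyst.
new_video = ["leaked footage reveals the real end of the world prophecy"]
print(model.predict_proba(new_video)[0][1])  # estimated probability of "conspiratorial"
```

A pipeline like this could then be run over the metadata of every video surfaced by an emulated watch-next crawl, yielding a daily fraction of recommended videos scored as conspiratorial.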
