Evaluating Platform Accountability: Terrorist Content on YouTube

Bibliographic Details
Published in: The American Behavioral Scientist (Beverly Hills), Vol. 65, No. 6, pp. 800-824
Main Author: Murthy, Dhiraj
Format: Journal Article
Language: English
Published: Los Angeles, CA: SAGE Publications, 01.05.2021

Summary: YouTube has traditionally been singled out as particularly influential in the spread of ISIS content. However, in 2017 the platform, along with Facebook, Twitter, and Microsoft, jointly created the Global Internet Forum to Counter Terrorism as one way to become more accountable and to take measures toward combating extremist content online. Though extremist content on YouTube has been found to have decreased substantially due to this and other efforts (human and machine-based), it is valuable to review historically what role YouTube previously played in order to better understand the evolution of contemporary moves toward platform accountability in extremist video content sharing. This study therefore explores what role YouTube's recommender algorithm had in directing users to ISIS-related content prior to large-scale pressure from citizens and governments to moderate extremist content more aggressively. To investigate this, a YouTube video network from 2016 consisting of 15,021 videos (nodes) and 190,087 recommendations between them (edges) was studied. Using Qualitative Comparative Analysis, the study evaluates 11 video attributes (such as genre, language, and radical keywords) and identifies sets of attributes potentially involved in the outcome of YouTube recommending extreme content. This historical review of YouTube at a unique point in platform accountability ultimately raises the question of how platforms might be more proactive, rather than reactive, in filtering and moderating extremist content.
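
The dataset described in the summary is a directed recommendation network: videos are nodes and each recommendation from one video to another is an edge. As a minimal illustrative sketch only (the article does not publish its code here; the file name, column names, and the use of Python with networkx are assumptions), such a network could be loaded and summarized like this:

    import csv
    import networkx as nx

    # Hypothetical input: one row per recommendation, with columns
    # source_video_id and recommended_video_id. These names are
    # assumptions for illustration, not the study's actual dataset.
    G = nx.DiGraph()
    with open("recommendations.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            G.add_edge(row["source_video_id"], row["recommended_video_id"])

    # Scale check against the figures reported in the abstract
    # (15,021 nodes and 190,087 edges).
    print(G.number_of_nodes(), "videos")
    print(G.number_of_edges(), "recommendation edges")

    # In-degree as a rough proxy for how often a video is recommended.
    top = sorted(G.in_degree(), key=lambda x: x[1], reverse=True)[:10]
    print(top)

Node attributes (genre, language, radical keywords, and so on) would then feed the Qualitative Comparative Analysis step described in the abstract, which is typically carried out with dedicated QCA tooling rather than the graph library itself.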
ISSN: 0002-7642
EISSN: 1552-3381
DOI: 10.1177/0002764221989774