Sunday, October 27, 2019

Any “radicalization” that occurs on YouTube happens according to the standard model of persuasion: People adopt new beliefs about the world by combining their prior beliefs with new information

A Supply and Demand Framework for YouTube Politics. Kevin Munger & Joseph Phillips. Penn State Political Science, October 1, 2019. https://osf.io/73jys/

Abstract: YouTube is the most used social network in the United States. However, for a combination of sociological and technical reasons, there exists little quantitative social science research on the political content on YouTube, in spite of widespread concern about the growth of extremist YouTube content. An emerging journalistic consensus theorizes the central role played by the video “recommendation engine,” but we believe that this is premature. Instead, we propose the “Supply and Demand” framework for analyzing politics on YouTube. We discuss a number of novel technological affordances of YouTube as a platform and as a collection of videos, and how each might drive the supply of or demand for extreme content. We then provide large-scale longitudinal descriptive information about the supply of and demand for alternative political content on YouTube. We demonstrate that viewership of far-right videos peaked in 2017.

2 The “Zombie Bite” Theory of YouTube
Beginning with Bridle (2017)’s viral essay about horrifying content auto-recommended to children, and extended to the realm of adult politics by journalistic enterprises like Nicas (2018) and Tufekci (2018), a single narrative has emerged: YouTube audiences are at risk of far-right radicalization, and this is because the YouTube algorithm, designed to maximize the company’s profits via increased audience time on the platform, has learned to show people far-right videos.

A working paper published online by Ribeiro et al. (2019) in August 2019 is by far the most rigorous and comprehensive analysis of YouTube radicalization to date. They find compelling evidence of commenter overlap between videos uploaded by the three ideological communities: the “Alt-Lite,” the “Intellectual Dark Web,” and the “Alt-Right” (we discuss this typology and propose an alternative typology below). The paper demonstrates that many of the commenters on “Alt-Right” videos had previously commented on videos from the other camps. This is valuable descriptive information, and it enables the scholarly community to better theorize about causal relationships of interest. However, this is not itself evidence in favor of any given theory of the underlying causal process that explains Alt-Right viewership.
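To make the overlap measurement concrete, here is a minimal sketch of the kind of computation involved. The data structures, user IDs, and community labels are invented stand-ins for illustration; this is not Ribeiro et al. (2019)’s actual pipeline.

```python
# Hypothetical illustration of commenter-overlap measurement: for each user
# who comments on an "Alt-Right" video, check whether they previously
# commented in the "Alt-Lite" or "I.D.W." communities. All data are invented.

from typing import Dict, List, Set

# commenters[community][year] = set of user IDs who commented that year
commenters: Dict[str, Dict[int, Set[str]]] = {
    "Alt-Lite":  {2016: {"u1", "u2", "u3"}, 2017: {"u2", "u4"}},
    "I.D.W.":    {2016: {"u3", "u5"},       2017: {"u5", "u6"}},
    "Alt-Right": {2017: {"u2", "u3", "u7"}},
}

def prior_overlap(target: str, year: int, sources: List[str]) -> float:
    """Fraction of `target` commenters in `year` who had already commented
    on any of the `sources` communities in an earlier year."""
    target_users = commenters[target].get(year, set())
    if not target_users:
        return 0.0
    earlier: Set[str] = set()
    for community in sources:
        for y, users in commenters[community].items():
            if y < year:
                earlier |= users
    return len(target_users & earlier) / len(target_users)

# Two of the three 2017 Alt-Right commenters (u2, u3) appeared earlier.
print(prior_overlap("Alt-Right", 2017, ["Alt-Lite", "I.D.W."]))  # 0.666...
```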

Ribeiro et al. (2019)’s conclusion admits as much: “Our work resonates with the narrative that there is a radicalization pipeline...Indeed, we manage to measure traces of this phenomenon using commenting users.”

The status of the “radicalization pipeline” is indeed best characterized as a “narrative,” rather than a theory. The chronological fact of people watching and commenting on Alt-Lite videos before moving on to Alt-Right videos is undeniable. But what model of the world does this call into question? Presented with the descriptive fact of “Alt-Right” creators with sizeable audiences on YouTube, did anyone theorize the existence of some kind of ideological discontinuity in the media that audience had previously consumed?

Indeed, the most plausible mechanism by which a viewership discontinuity might occur is the recommendation engine. But despite considerable energy, Ribeiro et al. (2019) fail to demonstrate that the algorithm has a noteworthy effect on the audience for Alt-Right content. A random walk that begins at an Alt-Lite video and takes five steps, at each step randomly selecting one of the ten recommended videos, will be recommended an Alt-Right video only about once in every 1,700 trips. For a random walker beginning at a “control” video from the mainstream media, the probability is so small that it is difficult to see on the graph, but it is certainly no more common than once in every 10,000 trips.
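The random-walk procedure itself is easy to sketch. The toy recommendation graph, labels, and resulting hit rate below are invented for illustration; Ribeiro et al. (2019) ran walks of this form on YouTube’s actual recommendations.

```python
# A minimal sketch of the five-step random walk described above, run on a
# hypothetical recommendation graph. All numbers here are invented.

import random

def make_toy_recs(n_videos=1000, n_alt_right=5):
    """Build a toy graph: each video recommends ten uniformly random videos.
    The first few IDs are (arbitrarily) labeled 'Alt-Right'."""
    ids = list(range(n_videos))
    recs = {v: random.sample(ids, 10) for v in ids}
    return recs, set(ids[:n_alt_right])

def walk_hits_alt_right(start, recs, alt_right, steps=5):
    """Take `steps` hops, each time following one of the ten recommended
    videos at random; report whether the walk ever lands on Alt-Right."""
    video = start
    for _ in range(steps):
        video = random.choice(recs[video])
        if video in alt_right:
            return True
    return False

recs, alt_right = make_toy_recs()
trials = 100_000
hits = sum(walk_hits_alt_right(500, recs, alt_right) for _ in range(trials))
print(f"landed on an Alt-Right video in {hits / trials:.4%} of walks")
```

Repeating such walks many times from Alt-Lite and from mainstream “control” starting points, and comparing the two hit rates, is what yields estimates like the one-in-1,700 and one-in-10,000 figures above.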

In short, the best quantitative evidence available demonstrates that any “radicalization” that occurs on YouTube happens according to the standard model of persuasion: people adopt new beliefs about the world by combining their prior beliefs with new information (Guess and Coppock, 2018). People select information about topics that interest them; if the topic is political, they prefer information that is at least somewhat congenial to their prior beliefs (Stroud, 2017). Persuasion, when it does happen, happens at the margins.
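The “combining prior beliefs with new information” model has a standard formal reading as Bayesian updating. The toy below, with invented numbers and not drawn from the paper, shows why a viewer with a strong prior moves only marginally after watching a few congenial videos.

```python
# Toy Bayesian-updating illustration of "persuasion at the margins":
# a viewer's belief in some claim is a Beta distribution that shifts
# only modestly with each new video. All numbers are invented.

def update_beta(alpha, beta, agrees):
    """One Beta-Bernoulli update: the video either supports (agrees=True)
    or contradicts the claim the viewer holds a belief about."""
    return (alpha + 1, beta) if agrees else (alpha, beta + 1)

alpha, beta = 8.0, 2.0          # strong prior: mean belief 0.80
for _ in range(3):              # three congenial videos watched
    alpha, beta = update_beta(alpha, beta, agrees=True)

print(f"posterior mean belief: {alpha / (alpha + beta):.2f}")  # 0.85
```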

The “Zombie Bite” theory is, of course, something of a straw man; no one has fully articulated and defended it. [...]
