CC BY 4.0 license · Open Access · Published by De Gruyter, June 8, 2020

The YouTube Algorithm and the Alt-Right Filter Bubble

Lauren Valentino Bryant
From the journal Open Information Science


The YouTube algorithm is a combination of programmed directives from engineers and learned behaviors that have evolved through the opaque process of machine learning, which makes the algorithm's workings difficult to understand. Independent attempts to replicate the algorithm have shown a strong bias toward right-leaning political videos, including the racist views expressed by the alt-right community. While the algorithm appears to push users toward alt-right video content merely to keep them in a cycle of video watching, the end result makes YouTube a powerful recruiting tool for neo-Nazis and the alt-right. The filter bubble effect this creates pushes users into a loop that reinforces radicalism rather than level-headed, factual resources.


Received: 2019-10-31
Accepted: 2020-04-09
Published Online: 2020-06-08

© 2020 Lauren Valentino Bryant, published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
