Tech & Terrorism: New Study Confirms YouTube Algorithm Promotes Misinformation, Conspiracies, Extremism

(New York, N.Y.) — YouTube’s recommended videos algorithm suggests extremist content, misinformation, and conspiracy theories to its users, according to a new report by the Mozilla Foundation. The findings are based on a crowdsourced research project in which thousands of YouTube users reported dangerous content through a web browser extension, which Mozilla researchers then analyzed. Of the videos reported, 71 percent of those flagged by volunteers as harmful had been recommended by YouTube’s algorithm.

The Mozilla Foundation’s report adds to an existing body of research demonstrating how the algorithms of major tech platforms play a key role in amplifying extremist and hateful content.

Counter Extremism Project (CEP) Senior Advisor Dr. Hany Farid, a professor at UC Berkeley, recently discussed in a webinar the role of algorithmic amplification in promoting misinformation and divisive content online, and its devastating consequences. Speaking on algorithmic amplification, Dr. Farid stated, “Algorithmic amplification is the root cause of the unprecedented dissemination of hate speech, misinformation, conspiracy theories, and harmful content online. Platforms have learned that divisive content attracts the highest number of users and, as such, the real power lies with these recommendation algorithms.”

In March 2020, Dr. Farid and other UC Berkeley researchers authored a study, A Longitudinal Analysis Of YouTube’s Promotion Of Conspiracy Videos, that analyzed YouTube’s policies and efforts to curb its recommendation algorithm’s tendency to spread divisive conspiracy theories. After reviewing eight million recommendations over 15 months, the researchers determined that the progress YouTube claimed in June 2019, when it said it had reduced the amount of time its users spent watching recommended conspiratorial videos by 50 percent, and again in December 2019, when it raised that figure to 70 percent, did not make the “problem of radicalization on YouTube obsolete nor fictional.” The study ultimately found that a more complete analysis of YouTube’s algorithmic recommendations showed conspiratorial recommendations are “now only 40 percent less common than when YouTube’s measures were first announced.”

CEP also conducted a study, between August 2 and 3, 2018, titled OK Google, Show Me Extremism, in which 649 YouTube videos were reviewed for extremist and counter-narrative content. Counter-narratives are part of Google’s Redirect Method, which seeks to target individuals searching for ISIS-related content and direct them to counter-narrative videos intended to undermine the messaging of extremist groups. Among the 649 videos reviewed, CEP was four times more likely to encounter extremist material than counter-narrative content. The results of CEP’s searches highlighted the extent of extremist content on YouTube and undermined YouTube’s claims touting the efficacy of its efforts to promote counter-narrative videos.

To watch a recording of the webinar, How Algorithmic Amplification Pushes Users Toward Divisive Content, please click here.

To read Dr. Farid’s report, A Longitudinal Analysis Of YouTube’s Promotion Of Conspiracy Videos, please click here.

To read CEP’s report, OK Google, Show Me Extremism: Analysis Of YouTube’s Extremist Video Takedown Policy And Counter-Narrative Program, please click here.

Daily Dose

Extremists: Their Words. Their Actions.

Fact:

On October 7, 2023, Hamas invaded southern Israel where, in the space of eight hours, hundreds of armed terrorists perpetrated mass crimes of brutality, rape, and torture against men, women, and children. In the biggest attack on Jewish life in a single day since the Holocaust, 1,200 people were killed and 251 were taken hostage into Gaza, where 101 remain. One year on, antisemitic incidents have risen to record levels.
