YouTube Child Exploitation Scandal Shows Once Again That Big Tech Only Acts Following Controversy

(New York, NY) – Counter Extremism Project (CEP) Executive Director David Ibsen released the following statement today after Google-owned YouTube was found to be facilitating pedophiles’ ability to connect and share content harmful to children. Since the incident, major companies including Disney, Nestlé, and Epic Games have pulled their advertising dollars from the platform, forcing YouTube to address the controversy.

“YouTube’s response to this latest incident involving the trade of harmful and predatory content follows a familiar playbook: only in the face of widespread public controversy – and after a real threat to its profit margins – will the company take any action to eliminate this exploitative material from its platform.
 
“For more than a decade, YouTube’s response to platform misuse has consisted of reactionary policies adopted only after the damage has already been done. In fact, in November 2017, YouTube faced this exact same controversy involving content harmful to minors on its platform. The company promised to do more then as well. Clearly, its past promises of better algorithms and improved technology are not working.

“YouTube’s promises to improve and pronouncements of policy changes are meaningless when the company fails to consistently and systematically enforce them. CEP has previously identified numerous instances of equally abhorrent content – from white supremacists advocating violence to ISIS videos glorifying suicide bombings and brutal beheadings – that the company has failed to routinely keep off its platform.
 
“YouTube’s behavior shows that so-called self-regulation of the tech industry is no longer an option. Google and other tech firms have made it clear that they will only act when there is outside pressure – namely from advertisers and governments. Corporate advertisers and lawmakers should continue to pressure the tech industry, compelling it to move past PR tactics and finally take the concrete measures necessary to prevent the spread of hateful and harmful material on its sites.”
 
YouTube’s negligence on matters of safety and security is nothing new. For years, YouTube ignored the tens of thousands of propaganda videos by radical Islamist extremist Anwar al-Awlaki on its website. Although the company, in a watershed moment, finally removed most of Awlaki’s materials in 2017, some of his content remains online today. Just last month, a CEP analysis found that dozens of Awlaki’s lectures are still easily locatable on YouTube, amassing tens of thousands of views. Some are months old, while others have been online for nearly a year.
 
CEP has identified at least 16 previous instances in which Google and YouTube made express policy changes only after public accusations, a scandal, or pressure from lawmakers. In November 2017, BuzzFeed News found exploitative YouTube videos of children in compromising, predatory, or creepy situations. YouTube was also found to be auto-filling search results with pedophilic terms. Back then, the company promised to use machine-learning technology and implement stricter controls on advertising for its users.
 
While one would hope that Google is continuously working to improve security on YouTube and its other platforms, there is no excuse for so many of its policy changes being reactionary, and the pattern raises the question of what other scandals are in the making due to still-undiscovered lapses in Google’s current policies. Two weeks ago, CEP identified instances on YouTube of the neo-Nazi manifesto Siege, which calls for violence against the U.S. government and the killing of Jews. The company refused to take down the audiobook, citing a desire to protect free speech.
 
In August 2018, CEP released a report titled “OK Google, Show Me Extremism,” which found that extremist propaganda – including violent videos – was still readily accessible on YouTube, far exceeding counter-narrative content and undermining the company’s claims about its efforts to combat online extremism. To conduct the research, between August 2 and August 3, 2018, CEP reviewed 649 YouTube videos for extremist and counter-narrative content, based on searches for six terms related to Islamic extremism. The results highlighted the extent of the enduring problem of terrorist content on YouTube.

To view additional resources on Google-owned YouTube’s failures to remove extremist content from its platforms, please see below:
