Regulatory Measures Needed to Rein in Tech Companies’ Failures & Inaction to Remove Horrific, Violence-Inciting Content
(New York, NY) – Counter Extremism Project (CEP) Executive Director David Ibsen released the following statement in response to tech giants’ release of figures on how much hateful content they blocked or removed under Germany’s Network Enforcement Act (NetzDG), as well as reports that the European Union plans to propose new rules governing the steps tech companies such as Google, Twitter, and Facebook must follow when taking down extremist content from their platforms:
“The European Union’s latest decision to impose more stringent requirements on tech companies in the fight against extremism is an acknowledgement that the industry’s preferred paradigm of so-called ‘self-regulation’ has failed. Moreover, figures issued by Facebook and Google/YouTube last week – as required under the German NetzDG law – illustrate how government action can increase transparency and accountability in the tech industry and provide the right incentives in the fight against online extremism. Tech corporations have repeatedly been shown to prioritize profits above all else, even the public’s safety. This latest round of pressure from lawmakers will hopefully serve as a wake-up call that profits should never be placed above human lives.”
YouTube reported removing 58,297 pieces of content based on 214,827 complaints received under the NetzDG, whereas Facebook handled only 1,704 complaints, of which just 362 pieces of content were deleted or blocked. These figures, however, do not explain why extremist and terrorist material remains on these popular platforms. For example, recent research from the Counter Extremism Project (CEP) shows that this dangerous content is still widely distributed on Facebook and can be accessed by users with just a few clicks (Zuckerberg's 99% Myth Exposed).
YouTube’s serious deficiencies in handling ISIS videos were recently revealed in a study undertaken by CEP and Dr. Hany Farid, the world’s foremost digital forensics and hashing expert. From March 8 to June 8, 2018, CEP conducted research to better understand how ISIS content is uploaded to YouTube, how long it stays online, and how many views these videos receive. To accomplish this, CEP ran a limited search for a small set of just 229 previously identified ISIS terror-related videos. The results clearly show that even though most of the identified videos are removed fairly quickly, they still manage to garner thousands of views. Additionally, known ISIS videos are re-uploaded time and again, further endangering the public.