Content Removal Policies Must Be Applied Consistently Across All Types of Extremism

October 11, 2019
CEP Staff
Facebook’s Latest Reactive Policy Change Meaningless Without Proper Enforcement

Facebook recently modified its Dangerous Individuals and Organizations policy in an attempt to limit extremism on its platform. Among other changes, the company expanded its definition of terrorist organizations, which previously covered "acts of violence intended to achieve a political or ideological aim," to also include "attempts at violence, particularly when directed toward civilians with the intent to coerce and intimidate." The announcement was another in a series of reactive moves Facebook has made since the Christchurch shooter used Facebook Live to broadcast his attack. The company has also been heavily criticized for allowing the El Paso shooter's manifesto, which was initially posted on 8chan, to remain on its platform.

Policy improvements announced in the wake of tragedies represent more of the same piecemeal, ad hoc approach of simply addressing the latest problem. Moreover, unless they are coupled with expanded capacity, dedicated resources, the deployment of proven technologies such as hashing, and ongoing research and development, definitional updates will not ensure clear and consistent enforcement across all forms of extremism.

“Facebook and other tech firms are free to set the terms for the content they allow and do not allow on their platforms. However, what regulators and the public require is that these new standards are clear, applied universally, and enforced effectively,” said Counter Extremism Project (CEP) Executive Director David Ibsen. “As tech companies continue to expand their definitions of extremism and terrorism, it would be reasonable to expect that those standards be applied with the same speed and vigor to different types of ideologies—whether it be white supremacist or Islamist. All types of extremist ideologies are dangerous and must be confronted equally.”

In May, CEP outlined Facebook's inconsistencies regarding its new policy change targeting both violent and non-violent white supremacist groups, a distinction the company chose to do away with after the Christchurch shootings. However, the company has failed to apply the new policy across all ideologies. For instance, radical Islamist organizations such as Hizb ut-Tahrir (HT) and the Muslim Brotherhood have inspired generations of terrorists and murderers, yet both groups maintain a presence on Facebook, Instagram, and Twitter. Muslim Brotherhood leader Yusuf al-Qaradawi is particularly active on Facebook and has called for attacks on American and Israeli civilians and troops, and for the execution of LGBTQ persons.

For more than a decade, Facebook has amassed a track record of reactive policy changes made only after widespread public criticism. Although the company has made some progress since its founding, it still tends to make policy improvements only after the damage has been done. This reactive approach has contributed to Facebook's continued inconsistent and ineffective enforcement of its own standards.
