Facebook’s Belated Response to Christchurch Massacre Mirrors Previous Reactive Policy Changes

(New York, N.Y.) – A day ahead of the launch of the Christchurch Call to Action in Paris, a voluntary commitment by governments and tech companies to combat online extremism, Facebook announced new policies concerning the misuse of Facebook Live. Facebook’s attempt to garner positive headlines in the aftermath of a preventable tragedy, while forestalling efforts by lawmakers to implement sensible and much-needed regulation, is indicative of the company’s cynical and reactive response to misuse on its platforms. The Counter Extremism Project (CEP) has documented this troubling pattern of behavior in its resource, Tracking Facebook’s Policy Changes.

On May 14, Facebook instituted a “one-strike policy” that restricts users who violate its most serious rules from using Facebook Live. In a blog post, the tech giant said that “anyone who violates our most serious policies will be restricted from using Live for set periods of time – for example 30 days – starting on their first offense.”

It took Facebook eight weeks to implement this very limited policy, which should have been put in place immediately after the Christchurch mosque attacks were broadcast on Facebook Live. Instead, the company delayed action until its meeting with the New Zealand prime minister and the French president to ensure maximum positive media coverage.

The latest policy change also raises questions about other egregious safety gaps on Facebook’s platforms that have yet to be discovered. Facebook has a history of failing to identify serious problems before they cause harm, apologizing and rushing out quick fixes only after significant damage has been done.

Moreover, Facebook’s documented failure to follow through on previous policy pledges raises serious doubts about whether this new measure will be enforced comprehensively and effectively.

For example, according to a recent whistleblower’s complaint to the Securities and Exchange Commission (SEC), Facebook has been misleading the public and its shareholders about the efficacy of its content moderation efforts. CEO Mark Zuckerberg testified before the U.S. Congress in April 2018 that Facebook was “[taking] down 99 percent of the Al Qaeda and ISIS-related content in our system before someone, a human even flags it to us.” The whistleblower, however, alleges that Facebook’s actual removal rate for extremist content was just 38 percent, not 99 percent.

The paradigm of self-regulation has led to serious misuse of Facebook and other tech industry platforms. Governments must be willing to issue binding mandates and enact new laws and regulations to compel tech companies’ compliance. Such action will result in more effective practices to eliminate terrorist and extremist content online.

To read the report, Tracking Facebook’s Policy Changes, please click here.
