Over the weekend, Twitter users successfully uploaded a video of the 2019 Christchurch shootings to the platform. The video, livestreamed by Brenton Tarrant, who attacked two mosques and killed 51 Muslim worshippers, was removed only after the government of New Zealand reported it to the company. Twitter’s terms of service state that the company prohibits content that “promote[s] terrorism or violent extremism,” and CEO Elon Musk has pledged that users would be suspended for posting content that is illegal or an “incitement to violence.”
Following the Christchurch attack, major tech companies, including Twitter, signed the Christchurch Call to Action, a nine-point action plan aimed at fighting terrorism and violent extremism online. The action points included a pledge to invest in new technologies to improve the detection and removal of terrorist content, a commitment to implement livestreaming checks to reduce the risk of disseminating terrorist content, and, among other things, a promise to improve the sharing of technological developments between large and small companies.
Although content moderation is at the forefront of many policy conversations, the content in question here—a video of a terrorist attack—should be treated as material that must unassailably be removed, much like child sexual abuse material (CSAM) or drug trafficking content, because it continues to inspire further violence. The shooter in the May 2022 Buffalo attack, for example, viewed a clip of the Christchurch shooting on 4chan and credited the attack as his inspiration in his online manifesto.
Unfortunately, the Counter Extremism Project (CEP) continues to find the Christchurch video, clips of the video, and support for the attack on a variety of platforms, including signatories of the Christchurch Call to Action, as well as on other online sites. Ahead of the one-year anniversary of the Christchurch attack in 2020, CEP researchers found the attack video in 17 online locations. Separately, in August 2022, CEP located three Twitter accounts that glorified the terrorist attack and had nearly 2,000 followers combined. One of these accounts remained on Twitter for almost four months.