Facebook’s Oversight Board Releases Annual Report Calling For More Transparency From Company

June 29, 2022 CEP Staff

Board Has No Authority And Recommendations Are Not Binding

(New York, N.Y.) — Last week, Facebook’s Oversight Board released its first annual report since its formation in 2020. In it, the 20-member independent body, tasked with ruling on contentious content issues such as extremism and terrorism, stated that it “continues to have significant concerns, including around Meta’s [Facebook’s parent company] transparency and provision of information related to certain cases and policy recommendations.” Before the creation of the Oversight Board, Facebook faced years of criticism over its content moderation policies, including the platform’s failure to remove extremist and terrorist content consistently and transparently.

Rather than taking preventive measures, Facebook has too often rushed to make cosmetic policy changes to limit reputational damage. One day after The New York Times published a November 2018 exposé detailing how COO Sheryl Sandberg and other Facebook executives worked to downplay and spin bad news, CEO Mark Zuckerberg announced that the company would establish an independent body to oversee its content moderation systems, which eventually became the Oversight Board.

Counter Extremism Project (CEP) Executive Director David Ibsen stated, “In an attempt to appease critics and lawmakers, Facebook created the Oversight Board, specifically using the terms ‘oversight’ and ‘board’ to imply some sort of authority. However, this is not a board of directors with management or legal authority. Rather, the Oversight Board is more akin to an internal working group that relies entirely on the company it is supposed to oversee for information and funding. Clearly, the Board is part of Meta’s public relations campaign aimed at obscuring the failures of self-regulation and muddying the calls for major reform and regulatory oversight.”

Ibsen continued, “If Meta is serious about its pledge to improve online safety, then it should ensure the Oversight Board is capable of true independent oversight, with the right to access any and all information related to its inquiries. The company should also bring onto the Oversight Board external experts who have core computer science skills, a proven track record of advancing online safety, and a healthy skepticism of tech companies. One such expert is U.C. Berkeley professor and CEP Senior Advisor Dr. Hany Farid, a digital forensics expert who pioneered the leading technology for combating child sexual abuse material online. Meta should also integrate members of the Oversight Board onto its corporate board. Doing so would give teeth to the Board’s policy recommendations and demonstrate a true commitment to ensuring the platform is free of terrorist content and safe for users.”

Since its creation, the Oversight Board has ruled against Facebook in 14 of 20 content removal cases, and Facebook has implemented about two-thirds of the Board’s 86 policy recommendations. Tensions between the Board and Facebook came to a head last year, when the company’s controversial XCheck content moderation program was revealed. The program allowed a workaround of the company’s content moderation policies, exempting at least 5.8 million VIP users. These VIPs were allowed to post “rule-violating material” that harassed others and incited violence. At the time, the Board found that the company was not “fully forthcoming” about XCheck and that its behavior was “not acceptable.”

To read CEP’s resource, “Updated: Tracking Facebook’s Policy Changes,” please click here.
