Meta had a system to protect celebrity Facebook and Instagram accounts from moderation
Facebook and Instagram keep a list of the most high-profile and commercially valuable users on their services, and even a body funded by their own parent company now says the special treatment is unethical.
Over the past year, Meta's oversight board has been investigating Facebook and Instagram's little-known cross-check program, in the wake of a damning 2021 Wall Street Journal report detailing how the platforms shielded millions of celebrity users from the company's enforcement and content-policing protocols.
Meta tasked the oversight board, which is funded by the company but operates largely independently, with investigating the program in October of last year. The board finally released its findings on Tuesday in a publicly available policy advisory to Meta. Its opinion: a major overhaul is needed for the cross-check program, which covers famous public figures including Meta CEO Mark Zuckerberg, U.S. Senator Elizabeth Warren, and former President Donald Trump.
The investigation uncovered several shortcomings in the way cross-check was used, chief among them that the program was structured to satisfy business interests rather than to enforce Meta's stated commitment to human rights. The board also criticized Meta for failing to police rule-breaking originating from cross-check accounts on Facebook and Instagram, enacting a double standard under which misleading or harmful posts could stay online indefinitely if they were created by the privileged users.
The board also criticized the cross-check system for unequal treatment of users, as Meta's statements implied its policies applied to all Facebook and Instagram users, when in fact cross-checked accounts were at times exempted from platform rules.

"Meta has repeatedly told the Board and the public that the same set of policies apply to all users," the report read. "Such statements and the public-facing content policies are misleading, as only a small subset of content reaches a reviewer empowered to apply the full set of policies."
"Any mistake prevention system should prioritize expression which is important for human rights, including expression of public importance," the review said, urging Meta to take steps to optimize the program.
Cross-check failures
The call for Meta to review and overhaul its cross-check program came after several high-profile users were able to skate past Facebook's and Instagram's content moderation protocols far more easily than most.
In 2019, Brazilian soccer star Neymar posted non-consensual sexual images of a woman who had previously accused him of rape on his Facebook and Instagram accounts; the photos were seen 56 million times and remained online for over a day, according to the Guardian. Moderators at Facebook and Instagram were unable to take down the posts immediately due to Neymar's status as a cross-checked user, according to the WSJ report.
But even the Neymar incident was not enough to bring the cross-check feature to the attention of Meta's oversight board, and the report criticized Meta for not making the cross-checked status of celebrity accounts transparent, even for internal review. The board did not directly investigate the cross-check program until 2021, when it was evaluating Donald Trump's ban from Facebook in the wake of the then-president's involvement in the January 2021 Capitol riot.
In its report, the oversight board detailed how Meta had originally envisioned cross-check as a mistake-prevention strategy meant to address over-enforcement of moderation protocols, that is, mistakenly removing content that does not violate Facebook or Instagram rules.
But the board also said that Meta seemed to prioritize under-enforcing moderation rather than over-enforcing it, seemingly out of fear that policing would come across as censorship.
"Meta stated that it prefers under-enforcement compared to over-enforcement of cross-checked content," the report read, adding that the perception of censorship was seen at Meta as a potentially significant hit to the company's business interests.
Meta's oversight board made a total of 32 recommendations to the company on how to overhaul the program, including more transparency and a greater focus on equal treatment of users.
A Meta spokesperson told Fortune that the company will begin reviewing the recommendations now and share its response in 90 days.