Facebook Oversight Board recommends overhaul of controversial content moderation process for VIPs




New York (CNN Business) — After a yearlong review, Meta’s Oversight Board on Tuesday said the company’s controversial system that applies a different content moderation process to posts from VIPs is set up to “satisfy business concerns” and risks harming everyday users.

In a nearly 50-page advisory containing more than two dozen recommendations for improving the program, the board — an entity financed by Meta but which says it operates independently — called on the company to “radically increase transparency” about the “cross-check” system and how it works. It also urged Meta to hide its most prominent users’ potentially rule-violating posts while they are under review, to avoid spreading them further.

The cross-check program came under fire last year after a report from the Wall Street Journal indicated that the system shielded some VIP users — such as politicians, celebrities, journalists and Meta business partners like advertisers — from the company’s normal content moderation process, in some cases allowing them to post rule-violating content without consequences. As of 2020, the program had ballooned to include 5.8 million users, the Journal reported.

At the time, Meta said that criticism of the system was fair, but that cross-check was created in order to improve the accuracy of moderation on content that “could require more understanding.”

In the wake of the report, the Oversight Board said that Facebook had failed to provide crucial details about the system, including during the board’s review of the company’s decision to suspend former US President Donald Trump. In response, Meta asked the board to review the cross-check system.

In essence, the cross-check system means that when a user on the list posts content identified as breaking Meta’s rules, the post is not immediately removed (as it would be for regular users) but instead is left up pending further human review.

Meta says that the program helps address “false positives,” in which content from key users is removed even though it does not break the company’s rules. But by subjecting cross-check users to a different process, one in which a human reviewer applies the full range of the company’s rules to their posts, Meta “grants certain users greater protection than others,” the Oversight Board said in its Tuesday report.

The board said that while the company “told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns … We also found that Meta has failed to track data on whether cross-check results in more accurate decisions.”

The Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. It is often described as a kind of Supreme Court for Meta because it allows users to appeal content decisions on the company’s platforms. Although Meta requested the board’s review, the company is under no obligation to incorporate its recommendations.

In a blog post published Tuesday, Meta President of Global Affairs Nick Clegg reiterated that cross-check aims to “prevent potential over-enforcement … and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe.” He said Meta plans to respond to the board’s report within 90 days. Clegg also outlined several changes the company has already made to the program, including formalizing criteria for adding users to cross-check and establishing annual reviews for the list.

As part of a wide-ranging advisory on restructuring cross-check, the Oversight Board raised concerns that, by delaying the removal of potentially violating content posted by cross-check users pending additional review, the company could be allowing that content to cause harm. It said that, according to Meta, “on average, it can take more than five days to reach a decision on content from users on its cross-check lists,” and that “the program has operated with a backlog which delays decisions.”

“This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the Oversight Board said. It recommended that “high severity” content initially flagged as violating Meta’s rules should be removed or hidden on its platforms while undergoing additional review, adding, “such content should not be allowed to remain on the platform simply because the person who posted it is a business partner or celebrity.”

The Oversight Board said that Meta should develop and share transparent criteria for inclusion in its cross-check program, adding that users who meet those criteria should be able to apply to join it. “A user’s celebrity or follower count should not be the sole criterion for receiving additional protection,” it said.

The board also said some categories of users protected by cross-check should have their accounts publicly marked, and recommended that users whose content is “important for human rights” be prioritized for additional review over Meta business partners.

For the sake of transparency, “Meta should measure, audit, and publish key metrics around its cross-check program so it can tell whether the program is working effectively,” the board said.


