Facebook Oversight Board Recommends Overhaul of Controversial VIP Content Moderation Process

New York

After a year of review, Meta’s Oversight Board said the company’s controversial system, which applies a different content moderation process to posts from VIPs, is designed to serve business interests and risks harming everyday users.

In a nearly 50-page advisory that included more than two dozen recommendations for improving the program, the board — an entity funded by Meta but said to operate independently — called on the company to “fundamentally increase transparency” about the “cross-check” system and how it works. It also urged Meta to take steps to hide potentially rule-breaking content posted by its most prominent users while that content is under review, to avoid further dissemination.

The cross-check program came under fire in November after reports from The Wall Street Journal revealed that the system shields some VIP users — Meta business partners such as politicians, celebrities, journalists and advertisers — from the company’s normal content review process, in some cases allowing them to post rule-breaking content without consequences. As of 2020, the program had ballooned to 5.8 million users, according to The Wall Street Journal.

At the time, Meta said criticism of the system was fair, but that cross-check was created to improve the accuracy of moderation for content that “may require more understanding.”

After the report was released, the oversight board said Facebook had failed to provide key details about the system, including during the board’s review of the company’s decision to suspend former U.S. President Donald Trump. The company responded by asking the board to review the cross-check system.

Essentially, the cross-check system means that when a user on the list posts something deemed to violate Meta’s rules, the post is not immediately deleted (as it would be for a normal user) but is instead left up pending further human review.

Meta said the program helps address “false negatives” — cases where a high-profile user’s content is removed even though it does not violate any rules. But the board found that by subjecting these users to a different process, one in which human reviewers apply the company’s full range of rules to their posts, Meta “provides certain users with greater protection” than everyone else.

The board said that while the company “told the board that the purpose of cross-check was to advance Meta’s human rights commitments, we found that the program was structured to more directly address the company’s concerns… We also found that Meta has failed to track whether cross-check results in more accurate decisions.”

The Oversight Board is made up of experts in areas such as freedom of expression and human rights. It is often described as Meta’s supreme court, as it allows users to appeal content decisions on the company’s platforms. While Meta asked the board for this review, it is under no obligation to follow the board’s recommendations.

In a blog post published Tuesday, Nick Clegg, Meta’s president of global affairs, reiterated that the purpose of cross-check is to “prevent potential over-enforcement … and to scrutinize cases where there may be a higher risk of error or where the potential impact” of a mistake is especially serious. Meta plans to respond to the board’s report within 90 days, he said. Clegg also outlined several changes the company has already made to the program, including formalizing the criteria for adding users to cross-check and establishing an annual review of the list.

As part of its wide-ranging recommendations for restructuring cross-check, the oversight board raised concerns about delays in removing potentially violating content. By holding cross-checked users’ posts for additional scrutiny, the company may allow harmful content to keep spreading. It said that, according to Meta, “on average, it can take more than five days to reach a decision on user content on its cross-check list,” and that “the program’s operational backlog has led to delays in decision-making.”

“This means that, because of cross-check, content found to be in violation of Meta’s rules remains on Facebook and Instagram when it is most popular and likely to cause harm,” the oversight board said. It recommended that “high severity” content initially flagged as violating Meta’s rules be removed or hidden from its platforms while it undergoes additional review, adding that such content “should not be posted simply because the poster is a business partner or a celebrity.”

The oversight board said Meta should develop and share transparent criteria for inclusion in its cross-check program, adding that users who meet the criteria should be able to apply to join it. “A user’s celebrity status or follower count should not be the sole criterion for receiving additional protection,” it said.

The board also said certain categories of users protected by cross-check should have their accounts publicly flagged, and suggested that additional scrutiny be prioritized for users whose content is “important for human rights,” rather than for Meta’s business partners.

In the interests of transparency, “Meta should measure, audit and publish key metrics around its cross-check process to determine whether the process is operating effectively,” the board said.
