The Oversight Board, the independent body that reviews content moderation for Facebook’s parent company Meta, has found that the company’s controversial cross-check programme for high-profile users “appears more directly structured to satisfy business concerns”, leads to unequal treatment of users and delays the removal of violating content.
The Oversight Board also ruled that there is a lack of transparency around how the mechanism works, as Meta has provided the public and users with only limited information about the cross-check programme.
The Board released a 57-page advisory on Tuesday, laying out the reasons why the company’s special moderation programme could undermine content moderation across its social networking platforms and promote unequal treatment of everyday users.
The cross-check programme first came under intense media scrutiny after The Wall Street Journal released an investigative report (based on internal documents) in September 2021, alleging rampant mismanagement and favouritism at the firm. The report, titled Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt, was published as part of the publication’s continued investigation into former Meta employee and whistleblower Frances Haugen’s explosive revelations. The report suggested that despite promises to treat all users alike, Meta, under its cross-check or XCheck programme, exempted a large number of politicians, celebrities and other prominent figures from the company’s regular enforcement process.
Following the WSJ’s disclosure, Meta asked the Oversight Board to review the cross-check programme and guide the company on how it could be improved. While the Board was preparing its policy advisory opinion, Meta shared that it was making about 100 million enforcement attempts on content every day, and that while 99 per cent of those decisions were accurate, the remaining 1 per cent still left room for a million mistakes a day. The cross-check programme, Meta added, responds to the broader challenges of moderating large volumes of content.
After nearly a year, the Board returned with its full review and a set of recommendations for Meta to adopt to improve the controversial cross-check programme. The Board reported that it found “several shortcomings” in the programme, which “appears more directly structured to satisfy business concerns”. By giving certain users extra protection, it allows offensive content that would otherwise have been taken down under community guidelines to remain visible for longer, causing harm.
“We also found that Meta has failed to track data on whether cross-check results in more accurate decisions, and we expressed concern about the lack of transparency around the programme,” said the Oversight Board, adding that unequal treatment is particularly concerning given the lack of transparency around Meta’s cross-check lists.
The Oversight Board has made the following recommendations:
Meta should prioritise expression that is important for human rights, including expression which is of special public importance
“Users that are likely to produce this kind of expression should be prioritised for inclusion in lists of entities receiving additional review above Meta’s business partners. Posts from these users should be reviewed in a separate workflow, so they do not compete with Meta’s business partners for limited resources.”
Radically increase transparency around cross-check and how it operates
“Meta should measure, audit and publish key metrics around its cross-check programme so that it can tell whether the programme is working effectively. The company should set out clear, public criteria for inclusion in its cross-check lists, and users who meet these criteria should be able to apply to be added to them. Some categories of entities protected by cross-check, including state actors, political candidates and business partners, should also have their accounts publicly marked.”
Reduce harm caused by content that violates the community guidelines but is left up during enhanced review
“Content identified as violating during Meta’s first assessment that is high severity should be removed or hidden while further review is taking place. Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity.”
The recommendations also involve splitting Meta’s content review systems into two categories: one addressing the company’s “human rights responsibilities” and one serving its “business priority”.
The Oversight Board was formed in 2018 at CEO Mark Zuckerberg’s direction and comprises human rights activists, lawyers, academics and other experts. Its main purpose is to guide Meta on disputed content moderation cases; it also accepts appeals directly from users seeking the removal or restoration of specific content on Meta platforms (Facebook, Instagram) once they have exhausted the appeals process on the respective platform. The Board makes decisions on a range of content: photos, videos, posts, text, shares and comments.
Meta is not bound to adopt the Oversight Board’s recommendations. It must, however, respond to the Board’s assessments and decisions.