Meta told to overhaul moderation system for high-profile users like Trump

Facebook parent company’s own ‘oversight board’ accused it of leaving dangerous content online to serve business interests

Meta has been told its treatment of high-profile users, such as former US president Donald Trump, left dangerous content online, serving business interests at the expense of its human rights obligations. Photograph: Jim Wilson/The New York Times

Meta has been told its treatment of high-profile users, such as former US president Donald Trump, left dangerous content online, serving business interests at the expense of its human rights obligations.

A damning report published on Tuesday by the company’s oversight board – a “Supreme Court”-style body created by the parent company of Facebook, Instagram and WhatsApp to rule on sensitive moderation issues – has urged the social media giant to make “significant” changes to its internal system for reviewing content from politicians, celebrities and its business partners.

The board, which started assessing cases last year, issues independent judgments on high-profile moderation cases as well as recommendations on certain policies; Meta’s responses to it are co-ordinated by the tech giant’s policy chief, former UK deputy prime minister Nick Clegg.

The board was asked to look into the system after the Wall Street Journal and whistleblower Frances Haugen revealed its existence last year, raising concerns that Meta was giving preferential treatment to elite figures.


Mr Clegg also has until January 7 to decide whether to allow Trump back on to the platform following a separate recommendation by the board.

After an investigation spanning more than a year, the board has demanded that Meta more closely audit who is on the so-called “cross-check” list and be more transparent about its review procedures.

The report is one of the most in-depth probes yet into moderation issues at Meta by the independent body, which comprises 20 journalists, academics and politicians and has grappled with concerns that it has little power to hold the company accountable.

Meta’s policy chief and former UK deputy prime minister Nick Clegg co-ordinates the company’s responses to the oversight board. Photograph: Etienne Laurent/AFP

It piles further pressure on chief executive Mark Zuckerberg, who last month announced plans to cut 11,000 staff amid declining revenues and growth, to ensure Meta’s content is policed fairly.

Meta has already begun to revamp the system. In a blog post on Tuesday, Mr Clegg said it was originally developed to “double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe”. He added that the company had now developed a more standardised system, with further controls and annual reviews.

It remains unclear how many people are currently on the secretive list. The Wall Street Journal, which first reported on it, estimated that by 2020 it covered 5.8 million users. Meta has previously said there were 666,000 as of October 2021.

The system meant that content posted by well-known personalities, such as Trump and US senator Elizabeth Warren, would remain on Meta’s platforms until human moderators had reviewed it, even if it would have been automatically removed had an ordinary user posted it.

This human review took five days on average – and in one case up to seven months – with the content remaining on the platform in the meantime, the report found.


Meta’s “own understanding of the practical implications of the program was lacking”, the board said, adding that the company had failed to assess whether the system worked as intended.

The board also accused the company of giving “insufficient” responses to the investigation, sometimes taking months to respond.

The board referenced a Wall Street Journal report detailing how Brazilian footballer Neymar posted non-consensual intimate imagery of another person to his Facebook and Instagram accounts; the material was viewed more than 50 million times before it was removed. According to Meta, this was because of a “delay in reviewing the content due to a backlog at the time”.


Thomas Hughes, director of the oversight board, said the Neymar incident was one example of how business partnerships could impact moderation processes.

“It opens up concerns ... about relationships between individuals in the company and whether that might influence decision-making,” he said.

“There was probably a conflation of different interests within this cross-check process,” he added.

The report follows previous public tensions between the board and Meta, after the former accused the social media company in September 2021 of withholding information on the system. Many see the board as an attempt to create distance between the company’s executives and difficult decisions around free speech.

Meta now has 90 days to respond to the recommendations. – Copyright The Financial Times Limited 2022