Facebook has said that a $52 million (€48 million) settlement paid to US content moderators to end legal claims for mental health damage does not apply to litigants suing in Ireland.
As part of the settlement, the internet giant is paying a minimum of $1,000 to each moderator, with further compensation for those diagnosed with mental disorders, including post-traumatic stress disorder, arising from their work censoring disturbing content posted on the network.
The moderators, typically employed through third-party contractors such as recruitment firm CPL and consultants Accenture in Ireland, must review graphic material depicting child sexual abuse, terrorism and animal cruelty to decide whether it should be removed.
Facebook is facing similar legal actions from employees and contractors who moderated content for the social network at its Dublin operations. Facebook’s Irish spokeswoman said the settlement relates to the US only and would not comment on “active litigation”.
A spokesman for Coleman Legal, the Dublin law firm instructed by 20 content moderators in Ireland claiming distress from their work, welcomed the settlement as it appeared to show that Facebook has accepted that moderators “can be injured as a result of over-exposure to the graphic nature of the work”, although the exact terms of the settlement have yet to be seen.
The settlement also showed that Facebook seemed “to have now overridden” the third-party company and acknowledged that moderators who have worked on Facebook content should be compensated regardless of who their employer is, the firm said.
The law firm said that it would be proceeding with the Irish cases regardless of the settlement.
‘Paltry sums’
Cori Crider, director of Foxglove, a UK litigation non-profit group advising and campaigning to raise awareness of the Irish legal actions, said that the levels of compensation in the US settlement were “fairly paltry sums” being offered to people with PTSD.
The level of psychological damage suffered by moderators should be a “matter for individual medical assessment” and this is what litigants expect to happen in the Irish case, she said.
For "thousands of lives shattered," Ms Crider said, Facebook's chief executive Mark Zuckerberg "should have paid far more."
"It will take years for people to recover from what they had to see to keep us all safe," said Ms Crider, whose firm is assisting Irish litigant Chris Gray in his personal injury action against Facebook and other litigants suing the courts in Europe.
More than 11,000 people who have worked as moderators for Facebook in California, Arizona, Texas and Florida since 2015 will qualify for compensation, with awards of potentially up to $50,000 per moderator, according to lawyers for the US plaintiffs who sued in California.
Mr Gray, who was a “community operations analyst” for Facebook employed through CPL, described the settlement as “derisory”.
There is “a lot of spin and PR” in the settlement, he said.
Mr Gray filed his High Court legal action in Dublin in December against Facebook Ireland and CPL Solutions, a subsidiary of the recruitment group.
He has said he moderated Facebook content including violence against children and mass murders. He said it was positive that “at least Facebook seem to be acknowledging that PTSD among its moderators is a risk”.
“But the money is derisory. They seem to be saying: here’s your money, now bugger off ... The items in the settlement are also not wholly new,” he said.
“For example, there is no commitment to outside experts coming in to monitor any of the efforts. They are still not acknowledging that there is a whole new world out there reflected in the often violent posts that need to be moderated that nobody has had to deal with before. And we don’t know how.”
The US legal action was taken by former moderator Selena Scola in a California state court in 2018. Several other former employees later joined the case as co-plaintiffs.
Support
Facebook’s Dublin spokeswoman said that the company was “committed to providing support for everyone who reviews content for Facebook, as we recognise that reviewing certain types of content can sometimes be difficult.”
She said that moderators had access to mental health counsellors, including 24/7 on-site support, an on-call service and access to private health care.
“We are also globally employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue and we’re continually looking at ways to expand support to content reviewers globally,” said the Facebook spokeswoman.
Ms Crider said that the moderators “need doctors, not a life coach” and that the “current conditions of content moderation create medical issues that require extensive and rigorous medical support”.