Moderators to take Facebook to court for ‘psychological trauma’

One described witnessing Isis executions, child exploitation and animal torture

Facebook. File photograph: Loic Venance/AFP/Getty

The Personal Injuries Assessment Board has given the go-ahead to a group of former content moderators to serve proceedings against Facebook in the High Court.

It is understood that more than five former moderators are seeking damages for personal injuries caused by the disturbing content they were exposed to while employed by CPL in Dublin on behalf of Facebook.

"The Personal Injuries Assessment Board has commenced authorising the issuing of High Court proceedings against Facebook," said Diane Treanor, solicitor with Dublin law firm Coleman Legal Partners, who is representing the moderators.

Under section 17 of the Personal Injuries Assessment Board Act 2003, if a plaintiff’s injury consists of psychological damage that would be difficult to assess by the board, it can give permission for the claim to be pursued through the courts.


The former content moderators have undergone medico-legal evaluation by a consultant psychiatrist at Bons Secours hospital. They will claim they are suffering psychological trauma as a result of both the graphic and disturbing content and the working conditions, which they will say fostered a constant fear of failure.

The moderators were among about 15,000 people in 20 locations around the world whose job it was to decide which content should be allowed to stay on Facebook, which should be left up but marked as “disturbing”, and which should be deleted.

Graphic content

One former moderator, Chris Gray, described witnessing Islamic State executions, murders, beatings, child exploitation and the torture of animals as part of his role.

Chris Gray, a former Facebook content moderator. Photograph: Alan Betson

Compounding the stress was the fact that “you’re in constant jeopardy of making a wrong decision and being penalised for it. That multiplies the stress when you’re looking at difficult content,” Mr Gray said at the time.

He described how the moderation process works: “There are grades of decision making, and if you get it wrong by just a little bit it still counts as a mistake. And that counts [against] your quality score, and you might be fired. You’re not just looking at it objectively; you’re trying to second-guess the system.”

It was reported earlier this year that three former content moderators had launched a class-action lawsuit against Facebook in California. The US workers alleged they suffered symptoms of post-traumatic stress disorder as a result of repeatedly viewing violent videos. The lawsuit alleges that Facebook violated California law by failing to provide a safe workplace for these workers.

Jennifer O’Connell is Opinion Editor with The Irish Times