
Sting operations, AI and a national database: How Irish investigators aim to tackle ‘explosion’ in online child sex abuse

Hotline.ie chief executive Mick Moran is at the forefront of efforts to unearth the origins of child sexual abuse material in Ireland and to remove its subjects from harm

Mick Moran, chief executive of Hotline.ie: 'When we see new material coming online, oftentimes the abuse is actually happening right now in a live setting.' Photograph: Bryan O’Brien

When viewing an image of online child abuse, investigators like Mick Moran have one thing on their minds: “Who is the kid? How do we figure out their identity?”

Catching who created the image and taking the material off the internet are obviously matters of urgency, he says, but finding the child and removing them from harm must take precedence. “When we see new material coming online, oftentimes the abuse is actually happening right now in a live setting,” he says.

Moran, who retired from An Garda Síochána in September, probably knows more about Child Sexual Abuse Material (CSAM), the preferred term for what in legislation is still known as child pornography, than anyone else in Ireland given his long years of experience investigating the crime.

During his time in the Garda, he spent years combating the growing issue of online child abuse with the Domestic and Sexual Violence Investigation Unit and the Garda Computer Crime Unit. He also spent 11 years seconded to Interpol in France, where he was assistant director for the vulnerable communities sub-directorate.


Upon his retirement, he was immediately selected as the chief executive of Hotline.ie, an industry group which works with the Department of Justice, gardaí and international agencies to remove CSAM from the internet.

Hotline’s focus is on assessing reports of CSAM and having it removed, either by contacting service providers in Ireland or, in international cases, by contacting that country’s CSAM reporting body. It also works closely with gardaí, who take on the role of identifying victims and perpetrators.


“For my analysts here, the joy they have in their hearts at being able to contribute to that [identification] work is part of what fundamentally drives them,” he says.

Finding the victim often means finding the abuser because, according to Moran, “statistically, the vast majority of child sexual abuse takes place within the family home or the immediate family circle. So that’s maybe Mammy or Daddy. Or it could be a brother. That’s a big one”.

But identifying victims and offenders is easier said than done. Abusers who share images online are increasingly aware that there is an army of skilled internet sleuths searching for any clues which might lead to their arrest.

Abusers not only blur the faces of their victims, they sometimes even obscure labels on clothing. But investigative methods, aided by artificial intelligence (AI), are improving all the time.

One suggestion which, if implemented, would be a game-changer for Ireland is the development of a national child abuse database, says Moran. This would serve as a repository for all CSAM gathered by Irish authorities, which would then be available to gardaí and academic researchers.

Access would have to be highly regulated, but protocols can be put in place. Similar ones are already in place for Hotline.ie researchers, says Moran.

It would help “in hundreds of ways”, he says, including in victim identification. Material could be cross-referenced to find repeat victims and establish patterns, with AI playing a crucial role in the identification process.

“They’re doing great work on AI up in Technological University Dublin, but getting them access to see CSAM is difficult,” Moran says.


This database could also be linked to other countries’ databases, increasing its effectiveness.

All of this is needed due to what Moran calls an “explosion in the availability of CSAM”. Hotline dealt with more than 29,000 cases of verified CSAM last year, twice the figure for 2022 and 10 times the figure for 2020.

The problem is not that there are more paedophiles: it is that paedophiles have increasingly easy access to CSAM.

“There have always been people in our society with a sexual interest in children. The vast majority go through life without ever manifesting that except through fantasy,” says Moran. “But unfortunately, the internet now has resulted in it being much easier in the modern world for someone to manifest their sexual interest in children.”

The figures are also increasing because Hotline is being more proactive. Moran says he will soon seek permission from the Garda and the Department of Justice to go after material on the dark web, a part of the internet accessible only through specialised software and often used for illegal file-sharing.

“If we do get that permission, then of course our numbers are going to go up,” he says.


Moran would also like to see a dedicated Irish unit tasked with launching undercover operations to catch online child predators. The idea is that trained gardaí would pose as underage children online with the aim of attracting and identifying potential abusers.

This activity is already carried out by other police forces and the Garda has some limited ability in this area. However, in many cases, it relies on agencies like the FBI to pass on information about Irish child abusers.

An undercover unit could be staffed by volunteer reserve gardaí who would be set up in a “controlled environment to find Irish men who are willing to come and meet 12-year-old girls to have sex”, says Moran.

In recent years, cyber-vigilante groups have engaged in similar activity. In some cases, this has led to prosecutions, but in others it has resulted in assaults being committed or innocent people being wrongly accused.

“Take it away from the online vigilantes,” says Moran. “Bring it in, control it properly, get a legal basis for it and protect the human rights of the individuals. Because even if they’re offenders, they still have human rights.”