Online abuse: ‘State-run, independent complaints mechanism needed’

Social media companies are failing to tackle online abuse, Oireachtas committee is told

Social media companies’ complaint mechanisms are often unsatisfactory ‘because they do not allow context’

Social media companies are failing to tackle online abuse, and a State-run, independent complaints mechanism is needed, an Oireachtas committee has been told.

The Joint Committee on Tourism, Culture, Arts, Sport and Media met today to conduct pre-legislative scrutiny of the General Scheme of the Online Safety and Media Regulation Bill.

The Bill seeks to establish a new Media Commission, including a dedicated Online Safety Commissioner.

Campaigners say the Bill should include an independent mechanism where people can lodge a complaint about a social media platform that fails to take down harmful content.


Alicia O’Sullivan (19), from Cork, spoke to the committee about her experience of image-based sexual abuse.

In April, she discovered that a fake Instagram account had been set up in her name. The perpetrator posted multiple explicit images of another woman’s naked body.

“The horror experienced when realising what was unfolding was amplified by the uninformed, dismissive and victim-blaming responses received from An Garda Síochána,” she said.

“Additionally, the initial reluctance by Instagram to remove this account, whilst simultaneously deleting my personal account, further aggravated the situation.”

Ms O’Sullivan wants gardaí to receive training in how to handle these crimes, as well as a public-awareness campaign to highlight the relevant law, the Harassment, Harmful Communications and Related Offences Act 2020, known as Coco’s Law. “I was told that someone posting illicit photographs purporting to be me was not illegal, when in fact it was and is.”

She claimed that when she reported the incident to gardaí, she was not told about the Cork Sexual Violence Centre or that the Divisional Protective Services Unit could help her.

Ms O’Sullivan added that a specific helpline or support network for victims should be created, and that social media platforms should be taking this type of abuse more seriously.

Prof Louise Crowley from UCC’s school of law urged politicians to start holding social media giants to account.

“A helpline needs to work side by side with a body that can do something about it . . . the obligation is on our lawmakers to take action to prevent, to make these platforms accountable for what they are doing. It is not enough to let them make their own rules.”

The committee also heard from Australia’s eSafety Commission, the world’s first government agency dedicated to keeping citizens safe online.

Under its legal regulatory schemes, Australians can lodge a complaint with the eSafety Commission if a social media company fails to take down content they consider harmful.

The government agency will then investigate the complaint, make a report, and contact the victim to gather more evidence.

The eSafety Commission can then ask the poster, website or hosting network to take the content down, even if the content is hosted outside Australia.

The commission can also go through the courts to get a legally binding order for content to be removed.

“Here in Australia during the pandemic, we’ve seen a surge in reports across all of our regulatory schemes,” said Commissioner Julie Inman Grant.

In the fourth quarter of 2020, reports about illegal online content increased by 96 per cent compared with the same period in 2019.

Reports of image-based abuse increased by 255 per cent, child cyberbullying increased by 19 per cent and adult cyberabuse increased by 53 per cent.

The system has been effective so far, with Ms Inman Grant stating that it has an 85 per cent success rate in getting intimate images removed.

Toby Dagg, executive manager of investigations with the commission, said social media companies’ complaint mechanisms are often unsatisfactory because they do not allow context.

The eSafety Commission can gather information on the cultural, social and individual factors that bear on the abuse, and can report this to the social media platforms.

Ms Inman Grant added that Ireland should appoint a commissioner with knowledge of the tech industry.

Proper mental health and wellness supports will also be required for staff who deal with complaints, as they will be exposed to horrific content, she said.