Children have a right to participate in the online world and to be protected from harm when doing so. They have a right to true and accurate information, and not to be exploited for commercial reasons. That’s what the UN Convention on the Rights of the Child tells us – but as we all know, these worthy goals are entirely disconnected from current reality.
This disconnect manifests itself in the disturbing global increase in mental health concerns for children and young people over the past 15 years. CyberSafeKids’ trends and usage report, published today, shows that more than 25 per cent of children under 12 are troubled by what they come across online, and a further 25 per cent encounter cyberbullying (rising to 38 per cent of 12- to 14-year-olds). The majority of children (57 per cent) tell us that their online experience is broadly not a positive one.
The playgrounds of this discontent are all the big tech platforms you will be familiar with. YouTube and Roblox – which advertises itself as the “imagination platform” and hosts a huge collection of games – stand out as the places where young children are most likely to encounter harmful content, while Snapchat and TikTok are the worst purveyors of such content for older kids.
This data is deeply troubling for an organisation such as ours, CyberSafeKids, which is working to close the significant education gap for children, parents, guardians and educators about how to be safe online. We have achieved a huge amount in nine years, but it’s been like bailing a boat with a soup ladle while big tech is filling it with a fire hose.
One thing we have realised is that while education remains a key strategy in better equipping children to be safe online, we need to do more to help parents and educators to support them. Parents have such a fundamental role and yet many don’t feel equipped to fulfil that role in a meaningful way. We urgently need to change that.
The other critical component of any solution is regulation. The tech companies behind the myriad services that Irish children use in huge numbers – Google, Meta, TikTok, Roblox and other gaming companies such as Epic Games, the makers of Fortnite – have a responsibility to ensure that children will be safe on their platforms.
I’ve been told not to be naive; that tech companies will always put their commercial interests ahead of the safety of their users and only change when governments hold them to account. There is truth in that. So regulation needs to really hit the bottom line if the companies continue to fail to uphold their duty of care in relation to children.
These companies are profiting enormously from children’s use of their services – to the tune of billions of dollars each year. There has to be some quid pro quo in terms of their safety and meaningful efforts to protect children. The current, heavily advertised, safeguarding efforts that companies are making fall woefully short, and we should be deeply suspicious of their claims to be able to self-regulate.
To make an analogy with the offline world, consider public spaces in which children gather – playgrounds, parks, schools. There are specific rules that apply to those spaces to ensure they are not just suitable for children, but that children can thrive in those spaces. Some will fall below the expected standard, but there are mechanisms in place to correct failings and enforce responsible ownership.
For far too long, there has been a complete lack of accountability in online spaces in relation to users in general, never mind children. Yes, that is changing in Ireland with the Online Safety and Media Regulation Act (OSMRA) and the Digital Services Act (DSA) at the European level, but concerning gaps remain.
We need to ensure that “safety by design” is the default obligation for any online service that children use – not just for the “very large online platforms” covered by the DSA, but for any service used by children. One gap is gaming platforms that are enormously popular with children, such as Roblox and Fortnite. The regulation of these is unclear: they fall below the size threshold of the DSA and, not being headquartered in Ireland, do not fall under the remit of the OSMRA either.
It’s time for child-specific online safety legislation, because the OSMRA does not go far enough in relation to child users. What is needed is legislation that defines a “safe online environment” for children and outlines features that do not meet that threshold, as well as setting out the age at which a young person can access those features, with robust age-verification mechanisms in place to protect younger users from them. The law needs to strengthen the powers of Coimisiún na Meán so that it can properly hold these companies to account.
There is no silver bullet, but there is clearly so much more that could be done. I worry for my own kids that these changes won’t come soon enough. But I remain optimistic that the realisation is dawning on us as citizens, and on our politicians as leaders, that we cannot delay action any longer.
Alex Cooney is chief executive of CyberSafeKids, whose Trends & Usage 2024 Report is published today