Nearly a quarter of teenage girls have come across harmful information online promoting weight loss and eating disorders such as anorexia or bulimia, according to new research.
Some 23 per cent of 13- to 17-year-old girls, and 12 per cent of boys of the same age, had come across “thinspiration” content online in the last year, promoting ways to be very thin, according to a major study into young people’s behaviour online.
One in ten teenagers had been exposed to information or discussions about different ways to die by suicide, while 18 per cent were shown imagery or content about self-harm, the research said.
The research, published on Monday, was based on interviews carried out by Ipsos-MRBI between December 2019 and October 2020. Researchers interviewed 765 children aged between nine and 17 in their homes, as well as 765 parents and 387 other adults.
The national online safety survey of both children and parents was produced by the State’s National Advisory Council for Online Safety.
More than a quarter of 13- to 17-year-olds surveyed said they had seen gory or violent images online of people hurting other people or animals.
Overall, 11 per cent of young people reported coming across content promoting ways to be very thin, while nine per cent had encountered information on ways to die by suicide.
Negative experience
One in five young people reported having a negative experience online in the past year, with girls twice as likely as boys to have had a bad experience.
Some 26 per cent of nine- to 10-year-olds had a profile on social media or online gaming sites, which increased to 87 per cent for 15- to 17-year-olds.
The most popular online app among those surveyed was YouTube, followed by Snapchat, Instagram and then Facebook.
Eight per cent of young people had received sexual messages online. This ranged from one per cent of those aged 11-12, up to 15 per cent of 15- to 17-year-olds. A further six per cent of young people said they had been sent unwanted requests for sexual information.
The study found parents significantly underestimated how likely it was their children had been exposed to harmful content online.
While four per cent of parents said their child had been exposed to information about self-harm, 13 per cent of children said they had seen such content.
Similarly, while six per cent of parents said their child had come across hate speech on the internet, 20 per cent of children said they had encountered it.
Eighteen per cent of 13- to 14-year-olds said they had been cyberbullied in the past 12 months. About four in 10 girls who had been cyberbullied said they were targeted on social media, or through messages sent to their phones.
Worries
The study found parents’ biggest worries about their child being online were that they could be exposed to pornography or racist and hateful messages, or that they would be contacted by an online predator. Half of parents said they were concerned their child might be asked to send sexual images while online.
While many social media companies set a minimum age requirement for users at 13, the survey found significant underage use of the platforms. Some 18 per cent of nine- to 11-year-olds used TikTok, while 17 per cent used Snapchat.
Speaking at the launch of the research, Minister for Culture and Media Catherine Martin said she would “urgently” progress the Online Safety and Media Regulation Bill, which included provisions to tackle harmful online content.
When in force the legislation would mean “the era of self-regulation will be over” for social media companies, Ms Martin said.