EU probes Meta over fears of child addiction to social media

Commissioner ‘not convinced’ online giant has done enough to mitigate risks to physical and mental health of young people from Facebook and Instagram

The new EU probe into Meta over child protection issues is the second into the company under the EU’s Digital Services Act. Photograph: iStock

Brussels has opened an in-depth probe into Meta over concerns it is failing to do enough to protect children from becoming addicted to social media platforms such as Facebook and Instagram.

The European Commission, the EU’s executive arm, said it would look into whether the Silicon Valley giant’s apps were reinforcing “rabbit hole” effects, where users get drawn ever deeper into online feeds and topics.

EU investigators will look at whether Meta is complying with legal obligations to provide appropriate age-verification tools to prevent children from accessing inappropriate content.

The probe is the second into the company under the EU’s Digital Services Act. The landmark legislation is designed to police content online, with sweeping new rules on the protection of minors.

It also has mechanisms to force internet platforms to reveal how they are tackling misinformation and propaganda.

The DSA, which was approved last year, imposes new obligations on very large online platforms with more than 45 million users in the EU. If Meta is found to have broken the law, Brussels can impose fines of up to 6 per cent of a company’s global annual turnover.

Repeat offenders can even face bans in the single market as an extreme measure to enforce the rules.

Thierry Breton, commissioner for internal market, said the EU was “not convinced” that Meta “has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.

“We are sparing no effort to protect our children,” Mr Breton added.

Meta said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

In the investigation, the commission said it would focus on whether Meta’s platforms were putting in place “appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors”. It added that it was placing special emphasis on default privacy settings for children.

Last month, the EU opened the first probe into Meta under the DSA over concerns that the social media giant was not properly curbing disinformation from Russia and other countries.

Brussels is especially concerned about whether the social media company’s platforms are properly moderating content from Russian sources that may seek to destabilise upcoming elections across Europe.

Meta has defended its moderation practices, saying it has appropriate systems in place to stop the spread of disinformation on its platforms. – Financial Times