EU to unveil landmark law to force Big Tech to police illegal content

Tech platforms will have to carry out risk assessments and risk mitigation to protect users

The EU is poised to unveil a landmark law on Friday that will force Big Tech to police their platforms more aggressively over illegal content. Photograph: Lionel Bonaventure/AFP via Getty Images

The EU is poised to unveil a landmark law on Friday that will force Big Tech to police their platforms more aggressively over illegal content, marking the latest move by regulators to curb the power of large technology groups.

The controversial practice of targeting users online based on their religion, gender or sexual preferences will be banned under the Digital Services Act (DSA), according to four people with knowledge of the discussions.

The DSA is a legislative package that for the first time sets the rules on how Big Tech should keep users safe online. It comes a month after the EU passed the Digital Markets Act, as it pushes ahead with the biggest overhaul of the laws governing the world’s biggest technology companies in more than two decades.

Under the DSA, manipulative techniques that lead people to unwillingly click on content on the internet, known as dark patterns, will also face a ban.


Margrethe Vestager, the EU’s executive vice-president in charge of digital policy, said she was hopeful of a breakthrough on Friday. She added the DSA would enable regulators to act so that users could be “safe online, buy products and express oneself”.

Child safeguards

As part of the deal, which will be agreed in Brussels between member states, the European Commission and the European Parliament, children will be subject to new safeguards, meaning online platforms such as YouTube or TikTok will need to explain their terms and conditions in a way a minor can understand.

Companies such as Facebook parent Meta will not be able to target minors with advertising under the new rules.

“The DSA shows that online platforms cannot do whatever they like and that they do not unilaterally set the terms of what users can or cannot see,” said an EU official working on the legislation.

The rules will also include an emergency mechanism forcing platforms to disclose what steps they are taking to tackle misinformation or propaganda in light of Covid-19 and the war in Ukraine.

Medium-sized platforms are likely to be given a grace period before they must fully comply with the new rules, while large companies such as Google and Amazon will have to comply as soon as the rules are enacted.

Big Tech will pay supervisory fees to cover the cost of monitoring their compliance with their obligations to police the internet, two people with direct knowledge of the matter said.

Large platforms, defined as having at least 45 million users in the bloc, will foot a yearly bill of between €20 million and €30 million, the people added. Those companies that break the rules will face fines of up to 6 per cent of global turnover.

‘Too big to care’

Search engines will also be captured by the new rule book, meaning companies such as Google will have to assess and mitigate risks when it comes to users spreading disinformation on its search platform.

Thierry Breton, the EU’s internal market commissioner, has warned Big Tech has become “too big to care”.

While regulators expect a deal to be struck on Friday, some warned the timing could slip and the final agreement could change at the last minute.

Tensions remain within the European Parliament between Greens, who want stronger privacy protections, and liberals, who are defending business-friendly rules.

But Christel Schaldemose, an MEP leading the debate on the DSA, told the Financial Times there was “momentum now” and “[this is] the best time for a deal”.

“We need to have better rules and better protection for users. The DSA will make the platforms responsible for their algorithms, they have to do risk assessment and risk mitigation, to protect us,” she said. – Copyright The Financial Times Limited 2022