Net Results: Dangerous call for ‘back doors’ to encryption

EU set for data access exposing everyone to pernicious risk of information breaches

Back doors create an exploitable access point to data on the devices and networks most organisations and individuals use. And they increase opportunities for spying and hacking.

Loony notions about encryption are back on the discussion table in the EU (and Ireland). This does not bode well for any of us.

Encryption is the encoding of information so that it cannot be accessed by anyone other than the intended recipient. It’s the cornerstone of safe online data transfer and the secure storage of data in computers.

Encryption underlies the purchases you make on the web, scrambling your credit card information so that it remains secure. It is fundamental to the transfer of data between businesses – the contracts, forms, and customer and partner data they exchange.

It underlies global finance systems. Encryption is also a key part of many popular messaging services. WhatsApp automatically encrypts conversations, as do Signal and Telegram.
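To make the idea concrete, here is a minimal sketch of symmetric encryption in Python. The library choice (the third-party cryptography package), the key handling and the sample card number are illustrative assumptions only; real web payments and messaging apps use far more elaborate protocols, but they rest on the same principle.

```python
# Minimal sketch of symmetric encryption using the third-party "cryptography"
# package (pip install cryptography). Without the key, the ciphertext is unreadable.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # secret shared only with the intended recipient
cipher = Fernet(key)

# e.g. a card number in transit; the ciphertext is scrambled bytes
ciphertext = cipher.encrypt(b"4111 1111 1111 1111")
print(ciphertext)                    # useless to anyone who intercepts it

# Only someone holding the key can recover the original
plaintext = Fernet(key).decrypt(ciphertext)
print(plaintext.decode())
```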


Law enforcement and surveillance agencies do not like digital encryption. They haven’t liked it for a very long time. Encryption was a huge political issue in the US when I first began writing about technology for The Irish Times, which is going on 25 years now.

At that time, the US government had been battling to keep encryption out of the hands of individuals and the average business. It also blocked the exportation of so-called “strong encryption” products – those that used the most unbreakable algorithms to protect data – to countries outside the US.

Criminals and terrorists

The problem with encryption, from law enforcement’s point of view, is that criminals and terrorists can also use encrypted communications that are nearly impossible to crack.

The solution, as generally proposed by law enforcement, is to disallow or disable encryption, either by banning specific encryption products or by requiring the creators of such products to supply a “back door” – that is, deliberately coding in a vulnerability that gives law enforcement a way to access encrypted information.
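To see what “deliberately coding in a vulnerability” looks like in practice, here is a deliberately naive key-escrow sketch; the function name and escrow design are hypothetical, for illustration only. Each message key is also wrapped under a single escrow key, so whoever obtains that one key – lawfully or otherwise – can read everything.

```python
# Naive "key escrow" back door, for illustration only. A copy of every user key
# is encrypted under a single escrow key held for law enforcement. Anyone who
# obtains the escrow key (by theft, leak or coercion) can unlock every message.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the master key law enforcement would hold

def send_with_backdoor(message: bytes):
    user_key = Fernet.generate_key()
    ciphertext = Fernet(user_key).encrypt(message)
    escrowed_key = Fernet(escrow_key).encrypt(user_key)  # the built-in vulnerability
    return ciphertext, escrowed_key

ciphertext, escrowed_key = send_with_backdoor(b"a private conversation")

# An attacker who steals the escrow key needs nothing else:
recovered_key = Fernet(escrow_key).decrypt(escrowed_key)
print(Fernet(recovered_key).decrypt(ciphertext).decode())
```

That single point of failure is exactly the risk security experts keep pointing to.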

Garda Commissioner Drew Harris has increasingly been floating an argument for back doors. Noting the growing use of encrypted phones by gangs, he has said several times this year that gardaí need (and expect) legislation that would supply a mysterious “electronic key” to break into encrypted devices.

This past week, a new document from the EU Council of Ministers faffs about with a similarly vague proposal: a draft council resolution on encryption.

While first offering the equivalent of those website “we respect your privacy” consent notifications that no one believes, the draft resolution wends its way towards stating it will propose back doors (“technical and operational solutions”).

It states: “Competent authorities must be able to access data in a lawful and targeted manner, in full respect of fundamental rights and the data protection regime, while upholding cybersecurity. Technical solutions for gaining access to encrypted data must comply with the principles of legality, transparency, necessity and proportionality. Since there is no single way of achieving the set goals, governments, industry, research and academia need to work together to strategically create this balance.”

Security experts

The terribly inconvenient problem here is that there is no way – zero, zilch, nada – of providing such access without putting everyone’s information, everyone’s data, at risk, including the entire global business and financial network, and the secret communications of law enforcement and security agencies themselves.

There is no technical way a back door can be implemented without also creating these risks. This has been said endlessly, over three decades, by computing and security experts.

But hey, don’t take it from me. Since 2014, Michael Hayden, the former head of the US National Security Agency (NSA) and the CIA, has argued (from, let’s just say, his well-informed position) that banning encryption companies and products or requiring back doors creates far worse security scenarios. In 2019, he wrote in an opinion piece for Bloomberg: “Law-enforcement agencies advocate for ‘extraordinary access’ to encrypted data to aid investigations . . . [but] allowing those agencies extraordinary access would needlessly increase the vulnerability of public and private actors to cyberattacks, without sufficiently addressing law enforcement’s needs.”

In short, he argues that bans and back doors only push users to other equivalent, easily-available products. Back doors also create an exploitable access point to data on the devices and networks most organisations and individuals use. And they increase opportunities for spying and hacking: “We must also consider how foreign governments could master and exploit built-in encryption vulnerabilities.”

A big fat irony in the EU making these proposals is that Privacy Shield and its predecessor, Safe Harbour, were invalidated by the Court of Justice of the EU in the two Schrems cases because of concern that US surveillance agencies could secretly gather and access EU citizens’ information under US law. That concern shapes, and is embedded in, the General Data Protection Regulation (GDPR).

But now, the EU (and Ireland) appear ready to promote clumsy forms of data access that would expose every citizen, business and government to far greater and more consequential risks than ending up in a random NSA mass data haul.

The EU, and Ireland, need to recognise the idiocy and the hypocrisy of taking such a step.