We need to think hard about how to regulate the internet

Net Results: Regulation and legislation vital when we do not own our data by default

Cambridge Analytica: yet another undercover exposé about data misuse. Photograph: Andy Rain

At the core of the whole Cambridge Analytica and Facebook scandal is this: unforeseen consequences.

The unforeseen consequences of developers creating a huge platform fuelled by the gathering of personal data. The unforeseen consequences of deciding the perfect business model is to collect as much as possible and monetise it later.

The unforeseen consequences of allowing third parties to siphon off data not just of individuals who give their consent, but of those individuals’ unknowing friends and contacts. The unforeseen consequences of mediocre and poorly enforced privacy policies. The unforeseen consequences of making available massive data sets comprising people’s identifiable personal information to researchers or third parties, without auditability.

The unforeseen consequences of having no oversight of third parties, to see what happens next with your users’ data. The unforeseen consequences of seemingly not caring that much anyway.


The unforeseen consequences of offering highly targeted ads to niche audiences, cheaply. The unforeseen consequences of allowing these to be ‘dark ads’, only seen by the intended audience, hiding content from others who might dispute claims and hold advertisers to account.

The unforeseen consequences of Irish governments underfunding the national data regulator for years, yet encouraging digital information-based companies to locate in the State. The unforeseen consequences of a regulator working out an oversight arrangement for Facebook, but failing to fully audit or enforce that agreement’s provisions (perhaps related to the first point).

Cambridge Analytica

The unforeseen consequences of Facebook learning of the misuse of 50 million personal profiles by Cambridge Analytica, and then taking two years to bar company access to Facebook’s platform.

Technology history is littered with unforeseen consequences, of course. The creator of the world wide web, Sir Tim Berners-Lee, noted his own reluctant acknowledgement of this in relation to his invention on Channel 4 on Tuesday. As the news team aired yet another undercover exposé of Cambridge Analytica, he said he’d changed his mind about the web needing to be completely open, unfettered and unregulated.

“We need to rethink our attitude to the web,” he said. We need to rethink a lot of things: our willingness to allow companies to continue to use our data as their product, for a start. Think you’re just playing a fun, free game or quiz online? Buying a book on the web? Running a health app? Using a DNA kit to see how Neanderthal you are? You’re also handing over valuable personal information that is regularly exploited internally or sold on as the company’s real, far more lucrative product.

Why do we not own our data by default? Why does, say, 23andMe not pay me for my DNA, given that it makes millions selling those DNA profiles to pharma researchers? Just because whole business sectors often have surreptitious income models predicated on our (free) data doesn’t mean we have to leave it that way.

But we also need to think – hard – about regulation and legislation. Knee-jerk proposals – such as removing all net anonymity – are popular with legislators but misguided, and would actually make Analytica-style exploitation of data even easier by tying everything we do online to an identifiable individual.

And anyway, ahem, Facebook’s major problem with an estimated 250 million fake or duplicate accounts proves how hard it is to make identification mandatory.

Protection for individuals

By contrast, Europe's incoming General Data Protection Regulation (GDPR) is, on the whole, a balanced and thoughtful piece of legislation that will offer far more protection to individuals – if still not the badly needed impetus to really rethink the web and its business models.

Still, it's an important start. But as we implement it in Ireland via a current Bill, legislators must consider the unforeseen consequences of some Irish-only provisions. Some clearly conflict with the actual GDPR and recent European Court of Justice rulings on data protection and privacy, and could produce unwanted outcomes.

Primary among these is section 43 of the Bill, which – counter to the actual GDPR – would allow collection of political information about individuals by political parties, candidates and politicians, or a body established by an enactment.

Even though the Bill has been amended this week, in the wake of the Cambridge Analytica revelations, to limit processing of such data to Irish entities only, for “electoral activities”, this section still opens the door to Analytica-type use of data.

Not only is the section not permitted under the provisions of the GDPR – even stretched to its conceptual limits – but lawyers say it leaves key terms such as “electoral activities” undefined, and could allow an individual to, say, run for a council seat in order to process data from abroad.

"Instead of taking a clear, simple, minimalist approach of repealing old legislation and replacing it with the GDPR, the government has treated this regulation [which must be enacted as written, but allows for some local tweaks] like a directive [which can be interpreted differently by each member state] and has been trying to override parts of the regulation, creating uncertainty," says solicitor Simon McGarr, director of Data Compliance Europe.

Creating uncertainty. Unforeseen consequences.