Hosting companies back in spotlight after latest US mass killings

Net Results: We are wrong to make internet corporations society’s de facto censors

Makeshift memorial for the victims of the mass shooting at a Walmart in El Paso, Texas: law enforcement doesn’t necessarily wish to have platforms or social media sites excluding all offensive content or users. Photograph: Larry W Smith

US authorities announced that the suspect in last weekend’s mass shooting in El Paso was believed to have posted a white supremacist diatribe to 8chan, once again spotlighting the extremist forum.

Although Jim Watkins, the chief executive of 8chan, denied in a YouTube video that the shooter had made such a post, the site has come under scrutiny before for alleged links to acts of domestic terrorism. The suspects in the New Zealand mass shooting that left 51 Muslims dead, and in the Poway, California, synagogue shooting in which one person died and others were injured, have both been linked with 8chan.

The decision this week by 8chan's hosting company, Cloudflare, to drop the platform under public pressure has highlighted one of the most complex challenges facing democracies, businesses, lawmakers and citizens.

What are the limits – if any – of a personal right to freedom of speech online? How are such rights, or their limitations, protected and defended on a global internet that mostly ignores geographic boundaries and national laws? Who decides? And who is the ultimate censor? Such concerns are further agitated by a widespread but wrong international assumption of a blanket, legal “right to free speech” that is actually a US first amendment constitutional right.


Society is more cognisant of some of these issues – or rather, has taken various sides in debates about the surface problems, without thinking through the underlying issues – when such questions involve content-hosting platforms such as Facebook, YouTube and Twitter. As ever-greater numbers of people post content seen as offensive, violent or harmful, others – activist groups, individuals, politicians, law enforcement – have demanded platforms take action to find and delete that content.

Political extremists

The platforms themselves have been forced, albeit reluctantly, to recognise the seriousness of the problem when explicit videos, images and screeds are posted by political extremists, criminals and fraudsters at one end of the spectrum, and at the other end (which can often be more immediately and personally distressing) by abusers, exploiters and bullies.

Platforms have been evolving policies on such content and activity, while arguing they should not be responsible for what members post because they are “just” platforms. They are neutral content hosts, they say, and therefore are not responsible for content in the way that, say, a media website or newspaper is, nor constrained by libel and defamation laws and restrictions on what is considered publishable content. The fact that their own recommendation algorithms allow, and even highlight, extremist content to some users has weakened this defence.

Cloudflare falls into a different category – or that is the line taken by its chief executive, Matthew Prince, who first pledged the company would not remove 8chan, but days later did.

Cloudflare is a major internet infrastructure company that hosts 19 million websites, including those of 10 per cent of Fortune 1000 companies. It offers ultra-secure hosting and inbuilt defences against the kind of mass-scale denial-of-service attacks that aim to take websites offline – whether the target be a multinational corporation or an extremist discussion board.

In a blog post this week, Prince noted the company did not want to be an arbiter of content. He also reminded readers that, as a private company, Cloudflare is "not bound by the first amendment".

Political solutions

He wrote: “Cloudflare is not a government. While we’ve been successful as a company, that does not give us the political legitimacy to make determinations on what content is good and bad. Nor should it. Questions around content are real societal issues that need politically legitimate solutions.”

Interestingly, he pointed to European governments taking “a lead” by imposing restrictions on the content platforms but – conveniently for Cloudflare – not acting similarly against pure hosting platforms.

However, many legal and civil rights advocates share his view. In 2017, in another Cloudflare case involving extremist content, lawyer Kate Klonick argued in the New York Times that such demands turn powerful companies into censors, especially as Cloudflare has little competition. "We have no idea what site [such companies might] take down next," she argued.

There’s also an indication that law enforcement doesn’t necessarily wish to have platforms or social media sites excluding all offensive content or users.

"Content removal doesn't necessarily cause extremists to go away, it causes them to go somewhere else. That may have negative consequences. It may have a disruptive effect, but it doesn't solve the problem," Prof Peter Neumann of Kings College London said at Berlin's digital festival Re:publica in 2017. He said extremists became harder to monitor and track if forced on to hidden, encrypted platforms

These are all difficult considerations. People have every right to pressure private companies over their actions and approaches to business. But turning corporations into society’s de facto censors is wrong. These complexities are global issues that need to be debated, and acted upon, at a thoughtful, global level.