Google, Facebook and other online services are deciding for us what they think we’re looking for, according to the author of a book on the ‘filter bubble’
ELI PARISER’S eureka moment arrived when he realised his conservative friends were disappearing from Facebook. They weren’t leaving the social networking site, but he would see them online less often. Their links would rarely turn up in his “top news” feed, and he figured out that because he was clicking on their pages and links less often than he would with his left-leaning friends, Facebook was “doing the math” and deciding what to show him and what to hide. After doing research, he realised Facebook wasn’t alone.
It's this anecdote that kicks off The Filter Bubble: What the Internet is Hiding from You, in which Pariser – a 30-year-old former executive director of MoveOn.org – examines how the internet is personalising and tailoring the information we consume.
It’s an action that Pariser believes has ugly consequences. Explaining what the filter bubble is, he writes: “The basic code at the heart of the new internet is pretty simple. The new generation of internet filters looks at the things you seem to like – the actual things you’ve done, or the things people like you like – and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us – what I’ve come to call a filter bubble – which fundamentally alters the way we encounter ideas and information.”
So when you type something into Google, you’re not getting a generic response from the search engine, but a tailored collection of links and news that Google decides you want. Therefore, each individual’s search could be different depending on their history.
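To make that mechanism concrete, here is a minimal sketch in Python of the kind of prediction engine Pariser describes. It is a toy model built on assumptions – the topics, stories and click counts are invented, and real systems weigh far richer signals – not the actual code of any Facebook or Google filter.

```python
# A minimal sketch of a "prediction engine": rank a feed by how often the
# user has clicked each topic before. All data here is hypothetical.
from collections import Counter

def rank_feed(stories, click_history):
    """Order stories so that topics the user clicked most come first."""
    topic_affinity = Counter(click["topic"] for click in click_history)
    return sorted(stories, key=lambda s: topic_affinity[s["topic"]], reverse=True)

feed = [
    {"title": "Budget vote passes", "topic": "politics"},
    {"title": "Cup final tonight", "topic": "sport"},
]
history = [{"topic": "sport"}, {"topic": "sport"}, {"topic": "politics"}]

for story in rank_feed(feed, history):
    print(story["title"])  # the sport story now outranks the politics story
```

The filtering is a side effect of the ranking: nothing is deleted, but whatever you click least slides down the feed until it effectively vanishes.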
Speaking from London, Pariser explains that during his research, he discovered these dynamics are now built into the internet – invisible to most of us yet hugely influential on the information we are presented with, and the opinions we form based on that.
“I was familiar with ads and targeted ads and targeted products, but what was much more weird and fascinating was this whole world of targeted content, targeted ideas,” Pariser says. “That seems to be the direction things are moving in very quickly. News sites are looking at it to promote news stories, search engines are building it in. It’s quickly all shifting towards that.”
The implications of such a filter range from subtle to seismic. Pariser contrasts this with how we consume media offline, where we make conscious decisions about what we consume. We know that if we hear the same issue discussed on the Adrian Kennedy Phone Show on FM104 or BBC's Newsnight, we're going to get two very different levels of debate, but that's our own decision to make. Online, however, what should be conscious decisions are being made for us. If you're constantly searching for anti-war articles, the content you get when you Google "Iraq invasion" is going to be different to that of someone who spends their time reading neo-conservative blogs.
“One of the problems I have with how Google does personalisation is that you don’t know, or most people aren’t aware that it’s happening,” Pariser says. “Essentially you’re getting a distorted view of the world – you’re surfing the web with blinders on.”
He believes we shouldn’t underestimate the impact of this. “The small adjustments to how certain ideas are seen or solicited can have huge impact. Just the shift between the first result and second result on Google means that you’ll lose half the people. In a medium where we’re constantly searching, constantly connecting, that’s big . . . The biggest danger is that there is a whole set of issues and topics that we’re likely to lose sight of completely.
“On Facebook, it’s very hard to click ‘like’ on a story about homelessness or the war in Afghanistan, so those things drop out of view. We don’t even know that they’re happening, that’s the biggest danger. On an individual level, there’s the feeling like you have a 360 [degree] view of the situation, but you’re actually stuck in your parochial view.”
The filter bubble is the consequence of commercial decisions. Facebook, Google and others need our data to make money from advertising: they use our personal data to present us with the links we are most likely to click, and they don’t seem to take into account the wider consequences of the methods they adopt to get that data.
Pariser believes we’ve been slow to grasp how things are changing because the way information and data are processed online is so complicated. “We haven’t got our heads around how information travels, or the physics of it. Offline, if you tell one person something about yourself, a limited number of people will find that out. Conversely, online, you can share something with any number of people in milliseconds. You can click on a link in one place and have that data sold to another website – without any acknowledgement – in milliseconds.
“So that, I think, is part of the problem. Companies don’t do anything to show that part of the process. It’s not intuitive for most people and they [companies] help that along because they sort of describe these services as free, but they’re definitely not free. The data we hand over to them is convertible to them directly into money . . . it would feel very different if Google was presented as a $100-a-year service you paid for in your data.”
This tailoring of information has other implications. Pariser uses a quote from Facebook’s chief operating officer, Sheryl Sandberg, to illustrate what he calls the “You-Loop”.
“People don’t want something targeted to the whole world – they want something that reflects what they want to see and know.”
Pariser sees this as opposite to what the internet once was. “It’s hard to imagine a more dramatic departure from the early days of the internet,” he writes, “in which not exposing your identity was part of the appeal.”
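The “You-Loop” is easy to simulate. The toy model below assumes – purely for illustration – that the filter always promotes the user’s most-clicked topic and that the user clicks whatever is promoted 90 per cent of the time; neither number comes from Pariser’s book or any real ranking system.

```python
# A toy "You-Loop": promote the most-clicked topic, watch the feed narrow.
import random

random.seed(1)  # make the run repeatable
topics = ["politics", "sport", "science", "arts"]
affinity = {t: 1.0 for t in topics}  # start with no real preference

for _ in range(50):
    promoted = max(topics, key=lambda t: affinity[t])
    # Assumption: users mostly click whatever is put in front of them.
    clicked = promoted if random.random() < 0.9 else random.choice(topics)
    affinity[clicked] += 1.0

print(affinity)  # one topic dominates; the others barely grow
```

Even from a near-uniform starting point, one interest comes to dominate within a few dozen rounds: the loop reflects the reader back at themselves.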
Pariser's book acts as a warning about how the internet is impacting on our lives. It's a natural successor to Nicholas Carr's The Shallows: What the Internet is Doing to our Brains. And its conclusions are unsettling. As Danny Sullivan, editor of the Search Engine Land blog, tells Pariser, "I don't think the genie goes back in the bottle."
What if he's wrong?
PARISER’S ARGUMENT is not without its critics. Here are some of the most common arguments against his “filter bubble” thesis.
What if the filter bubble is a good thing?
Some might argue that the filter bubble is a consequence of a relentless and overwhelming amount of information online.
Perhaps if information were not filtered according to our demands and interests, we would have quite a hard time finding the relevant information we want online.
Companies like Google and Facebook are not overtly trying to censor.
They’re trying to personalise web searches for two reasons – so that information is more tailored to the user; and so that the tailored information results in more click-throughs. While the actions of Google, Facebook and others are not inherently bad, the consequences could be perceived to be.
Facebook is being used as a media outlet when ultimately it does not intend to be one.
Pariser says there is a danger certain news stories will disappear from view because of how Facebook has tailored interaction with them – for example, the difficulty of clicking the “like” button on a story about the war in Afghanistan.
But although Facebook would like to be an all-encompassing, one-stop shop for an individual’s media, it is still essentially a social network and perhaps should not be used as a primary news source.
Pariser’s methods of showing how the filter bubble allows for different search results are vague.
In a TED talk in 2011, Pariser demonstrated how searches are personalised using the example of two people searching for the term “BP”.
By showing that identical searches by different people yield different results, Pariser contended that Google personalised results using 57 signals about an individual user.
Pariser’s critics suggested that his methods were vague. But it is difficult to analyse search results succinctly, given that every individual and every individual’s computer carries a long trail of history, information and behaviour that could yield a vastly different – or similar – quantity and type of results.
How the 'filter bubble' works
According to Pariser, the bubble works like this: "Most personalised filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media.
"Eric Schmidt [Google] likes to point out that if you recorded all human communication from the dawn of time to 2003, it would take up about five billion gigabytes of storage space. Now we're creating that much data every two days."
As for bursting the bubble: "Someone who shows interest in opera, comic books, South African politics and Tom Cruise is harder to pigeonhole than someone who just shows interest in one of those things."
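Pariser’s three-step model maps readily onto pseudocode. The Python sketch below is one hypothetical reading of it – profile, fit, tune – with invented field names and a made-up adjustment rate, offered as an illustration rather than a description of any real recommender.

```python
def build_profile(clicked_topics):
    """Step 1: figure out who the user is from what they clicked."""
    profile = {}
    for topic in clicked_topics:
        profile[topic] = profile.get(topic, 0.0) + 1.0
    return profile

def select_content(profile, stories):
    """Step 2: serve the stories that best fit the profile."""
    return sorted(stories, key=lambda s: profile.get(s["topic"], 0.0), reverse=True)

def tune(profile, story, clicked, rate=0.5):
    """Step 3: adjust the fit based on whether the user engaged."""
    profile[story["topic"]] = profile.get(story["topic"], 0.0) + (rate if clicked else -rate)
    return profile

profile = build_profile(["sport", "sport", "politics"])                     # step 1
feed = select_content(profile, [{"topic": "science"}, {"topic": "sport"}])  # step 2
profile = tune(profile, feed[0], clicked=True)                              # step 3
```

Note how a profile spread evenly across opera, comic books, South African politics and Tom Cruise would produce flat scores in step 2 – which is precisely why, as Pariser says, the eclectic reader is harder to pigeonhole.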