A new study from Facebook is keen to highlight how small an effect the News Feed algorithm has on the diversity of content users are exposed to, but the data suggests that the social network may be placing us in a political echo chamber.
Published in the journal Science, the study examines the content consumption of more than 10 million Facebook users who volunteered their political stance by labelling themselves as liberal or conservative.
The findings show the algorithm strips out one in 20 hard news stories posted by your liberal buddies if you're conservative-minded, while one in 13 conservative stories is omitted from the News Feed of a self-identified liberal, creating a “filter bubble” where you are more likely to see things you agree with.

Manipulation

Facebook users should be wary of the fact that this algorithm presents you with content related to your ideological standpoint and removes content from sources you are less likely to agree with, says John Breslin, senior lecturer in Electronic Engineering at NUI Galway and technology start-up advisor.
“We don’t actually know what else these algorithms are selecting your stories based on, or how widespread their effects are. For example, we had the experiment carried out by Facebook in 2012 and published last year where they manipulated the display of happy and sad stories to 150,000 users to see if they would in turn share happy or sad content.
“I hope that was an isolated test, but it did cause me to stop using my own Facebook personal profile,” he says.
“The problem is that social networks are effectively increasing political polarisation, and this is being accelerated by algorithms such as the one from Facebook that curates your news feed for you: such news selectivity is generally accepted as being counter to democracy.”
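To make the study's headline figures concrete, here is a minimal, purely illustrative sketch in Python. The suppression rates are the ones reported in the Science paper; everything else, including the filter_feed function and its data structures, is hypothetical and not Facebook's actual code.

```python
# Purely illustrative: the rates below are the study's reported figures;
# the function and data structures are hypothetical, not Facebook's.
import random

# Fraction of cross-cutting hard news stories the study says the
# algorithm removes, keyed by the viewer's self-reported ideology.
SUPPRESSION_RATE = {
    "conservative": 1 / 20,  # 1 in 20 liberal stories hidden from conservatives
    "liberal": 1 / 13,       # 1 in 13 conservative stories hidden from liberals
}

def filter_feed(stories, viewer_ideology):
    """Drop cross-cutting stories at the study's reported rates."""
    rate = SUPPRESSION_RATE[viewer_ideology]
    kept = []
    for story in stories:
        cross_cutting = story["ideology"] != viewer_ideology
        if cross_cutting and random.random() < rate:
            continue  # story never reaches the News Feed
        kept.append(story)
    return kept

# A conservative reader's feed of 1,000 liberal-leaning hard news stories
# loses roughly 50 of them before the reader ever sees the list.
feed = [{"id": i, "ideology": "liberal"} for i in range(1000)]
print(len(filter_feed(feed, "conservative")))  # ~950
```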
"My view would be that this kind of conversation, while negative, is in Facebook's interest: they should be pushing the debate about where the news comes from," says Maryrose Lyons, digital marketing consultant and CEO of Brightspark Consulting.
“The majority of social media and search engine users don’t even think about where their content comes from or that algorithms are involved in the process.”
Facebook could also see this as a commercial opportunity where advertising is not the only revenue model, explains Lyons: “They could charge for a premium service that includes the ability for users to tweak the algorithm and set their own filters.”

Ranking and filtering
Defending the News Feed algorithm, Facebook engineer Lars Backstrom has previously stated that users would “miss something they wanted to see if we displayed a continuous, unranked stream of information”.
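The trade-off Backstrom describes can be sketched in a few lines. This is a toy illustration under assumed signals, not the real News Feed ranker: the scoring formula and the predicted_interest field are invented for the example.

```python
# A hypothetical contrast between an unranked stream and a ranked feed;
# the scoring here is invented, and real ranking signals are opaque.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    age_hours: float
    predicted_interest: float  # 0..1, stands in for whatever signals are used

def unranked_feed(stories):
    # Continuous, unranked stream: newest first, interest ignored.
    return sorted(stories, key=lambda s: s.age_hours)

def ranked_feed(stories):
    # Toy scoring: predicted interest discounted by age.
    return sorted(stories,
                  key=lambda s: s.predicted_interest / (1 + s.age_hours),
                  reverse=True)

stories = [
    Story("Breaking celebrity news", 0.5, 0.15),
    Story("Close friend's wedding photos", 6.0, 0.9),
]
print([s.title for s in unranked_feed(stories)])  # celebrity news first
print([s.title for s in ranked_feed(stories)])    # wedding photos first
```

Under the unranked stream, the older story a user cares about sinks below fresher but less relevant posts; the ranked feed surfaces it, which is the behaviour Backstrom defends.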
This ranking and filtering of politically charged content is both a help and a hindrance, says Lyons: “I don’t want to see any ‘No’ coverage in my News Feed ahead of the marriage referendum, and I’m happy the algorithm is keeping opposing views from me, but I would like to see a broad choice of content coming up to a general election.”
Ultimately, it is the opacity of these algorithms that is the biggest problem, says Breslin.
“It’s not like having an editor who can decide on any one day that it is in the interests of a newspaper’s readers to see a more diverse range of news stories around an important breaking topic: these algorithms are increasing a user’s selective exposure to news, before the user can decide what to select themselves.”