Facebook researcher apologises over emotion experiment

Social network came under criticism after attempting to temporarily manipulate users’ feelings

The experiment found that users’ behaviour was likely to be affected by the number of positive or negative posts in their feed. Photograph: Bloomberg

A Facebook Inc. researcher has apologised after conducting an experiment that temporarily influenced what almost 700,000 users saw on their news feeds, reviving some users’ concerns about privacy.

The number of positive and negative comments that users saw on their feeds of articles and photos was altered in January 2012, according to a study published June 17 in the Proceedings of the National Academy of Sciences, a US scientific journal.

People shown fewer positive words were found to write more negative posts, while the reverse happened with those exposed to fewer negative terms, according to the trial, which involved randomly selected Facebook users.

Adam Kramer, a Facebook data scientist who was among the study’s authors, wrote on his Facebook page yesterday that the team was “very sorry for the way the paper described the research and any anxiety it caused”.


The data showed online messages influence readers’ “experience of emotions,” which may affect offline behaviour, the researchers said.

Some Facebook users turned to Twitter to express outrage over the research as a breach of their privacy. “Facebook knows it can push its users’ limits, invade their privacy, use their information and get away with it,” said James Grimmelmann, a professor of technology and the law at the University of Maryland, adding that the website has “done so many things over the years that scared and freaked out people.”

But Mr Grimmelmann said the anger won’t have a long-lasting effect. While some users may threaten to leave Facebook, most people “want to be where their friends are”, and no alternative social network offers more privacy.

In the study, the researchers, from Facebook and Cornell University, wanted to see if emotions could spread among people without face-to-face contact. The Facebook study is “really important research” that shows the value of receiving positive news and how it improves social connections, said James Pennebaker, a psychology professor at the University of Texas.

Facebook might have avoided some of the resulting controversy by allowing users to opt out of taking part in any research, he said. “It will make people a little bit nervous for a couple of days,” he said in an interview.

“The fact is, Google knows everything about us, Amazon knows a huge amount about us. It’s stunning how much all of these big companies know. If one is paranoid, it creeps them out.”

Facebook said none of the data in the study was associated with a specific person’s account. Research is intended to make content relevant and engaging, and part of that is understanding how people respond to various content, the Menlo Park, California-based company said in a statement yesterday.

“We carefully consider what research we do and have a strong internal review process,” Facebook said. “There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Bloomberg