Facebook Conducts Massive Psychology Experiment on Users, and You May Have Been a Subject

06/28/2014

A group of scientists from Facebook, the University of California, San Francisco, and Cornell University has conducted a study, "Experimental evidence of massive-scale emotional contagion through social networks," on nearly 700,000 unwitting Facebook users. This study shows that:

emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.

The study was based on the observation that:

On Facebook, people frequently express emotions, which are later seen by their friends via Facebook’s “News Feed” product.

To perform this study, the researchers "manipulated the extent to which people were exposed to emotional expressions in their News Feed." In other words, Facebook messed with your head. They discovered that:

for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred.
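The paper doesn't publish code, but the measurement behind those percentages is simple word counting: automated text analysis scored each status update, so no human ever read the posts. Here's a minimal sketch in Python of that kind of scoring. The tiny word lists below are placeholders of my own, not the real lexicon; the actual study used the much larger, proprietary LIWC2007 word lists.

```python
import re

# Placeholder word lists for illustration only; the study used LIWC2007.
POSITIVE_WORDS = {"happy", "great", "love", "good", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "bad", "terrible"}

def emotion_percentages(status_update: str) -> tuple[float, float]:
    """Return (% positive words, % negative words) in a status update."""
    words = re.findall(r"[a-z']+", status_update.lower())
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

print(emotion_percentages("I love this wonderful day"))     # mostly positive
print(emotion_percentages("I hate this terrible traffic"))  # mostly negative
```

That's all "machine analysis" means here: counting words against fixed lists, then comparing the average percentages between the manipulated and control groups.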

Now there are a lot of issues here, but I think the first is that of consent. The A.V. Club points out that:

In order to sign up for Facebook, users must click a box saying they agree to the Facebook Data Use Policy, giving the company the right to access and use the information posted on the site. The policy lists a variety of potential uses for your data, most of them related to advertising, but there’s also a bit about “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” In the study, the authors point out that they stayed within the data policy’s liberal constraints by using machine analysis to pick out positive and negative posts, meaning no user data containing personal information was actually viewed by human researchers. And there was no need to ask study “participants” for consent, as they’d already given it by agreeing to Facebook’s terms of service in the first place.

But this is not the way experiments work in academia, and I'm very surprised that the institutional review boards of the universities involved approved a study whose participants were totally unaware that they were being psychologically manipulated.

This is yet another reason not to trust Facebook. I've been on the fence about Facebook for a long time; I don't use it much, mainly to keep in touch with people I haven't seen in ages, or with internet friends. My Macworld colleague Chris Breen wrote, a couple of years ago, about why he left Facebook; he was more concerned about the privacy implications. But now I'm going to think carefully about leaving Facebook myself. Recently, I've seen a number of things come up in my news feed (often from other people posting on friends' news feeds) that are angry, divisive, and just plain stupid. Many of them are the sort of viral conspiracy theories that make Facebook a bad place to hang out; others are merely annoying, like the constant quizzes people post asking "What kind of [noun] are you?" Oh, and Buzzfeed.

But the takeaway from this study is that Facebook can, and does, manipulate what you see, for various reasons. In this case, it was a study to see if they could make you unhappy or happy; in other cases, it may be to prime you to react positively to certain types of ads. (Though you can banish ads using the F.B. Purity browser plug-in.) No matter what the reason, the fact that they now know they can do this should make us all think very carefully about our relationship with Facebook.

P.S.: While I think he exaggerates a lot, Noam Chomsky's Manufacturing Consent (Amazon.com, Amazon UK) shows just how easy it is for the media to manipulate much of what we think. I try to avoid the tinfoil-hat type of thinking that Chomsky propagates, but much of what he says there is valid, if dated.