Facebook Lies about its Mind Experiment; Doesn’t Care about Your Concerns


I wrote the other day that Facebook conducted a massive psychology experiment on users, and you may have been a guinea pig. A lot of pixels have been spilled about this, and some of the information that has come out about the experiment is interesting.

First, the people behind the experiment just don’t understand why people are upset. In a statement on a Facebook post (of course), Adam Kramer, the lead researcher, said:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

He did express some surprise at the reaction, and did sort of apologize:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

But his apology was for “the way the paper described the research and any anxiety it caused,” and not the experiment itself.

Meanwhile, Forbes has a great deal of information about this study.

First, they report that there was no institutional review board approval at the universities involved, only within Facebook, which is certainly not the right group to decide whether this type of study is ethical. The editor of the journal that published the study said that “the data analysis was approved by a Cornell Institutional Review Board but not the data collection.”

“Their revision letter said they had Cornell IRB approval as a ‘pre-existing dataset’ presumably from Facebook, who seems to have reviewed it as well in some unspecified way.”

Then, Cornell University stated that they didn’t need to review the experiment:

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

Even more damning, Facebook, which claimed that all users agreed to participate in “research” as part of its terms and conditions, only added the word “research” to those terms four months after it carried out the experiment. And there was no age filter: users aged 13 to 18 may have been part of the study, which is a clear violation of the law in the US and other countries.

The New York Times reports that British regulators are looking into this case, and the US Federal Trade Commission is also considering an investigation.

So there was no informed consent of any kind, no institutional review board approval, and Facebook may have experimented on adolescents, which is illegal. Can you trust them with anything at all?

Facebook Conducts Massive Psychology Experiment on Users, and You May Have Been a Subject


A group of scientists from Facebook, the University of California, San Francisco, and Cornell University has conducted a study, Experimental evidence of massive-scale emotional contagion through social networks, on nearly 700,000 unwitting Facebook users. This study shows that:

emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.

The study was based on the observation that:

On Facebook, people frequently express emotions, which are later seen by their friends via Facebook’s “News Feed” product.

To perform this study, they “manipulated the extent to which people were exposed to emotional expressions in their News Feed.” In other words, Facebook messed with your head. They discovered that:

for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred.

Now there are a lot of issues here, but I think the first is that of consent. The A.V. Club points out that:

In order to sign up for Facebook, users must click a box saying they agree to the Facebook Data Use Policy, giving the company the right to access and use the information posted on the site. The policy lists a variety of potential uses for your data, most of them related to advertising, but there’s also a bit about “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” In the study, the authors point out that they stayed within the data policy’s liberal constraints by using machine analysis to pick out positive and negative posts, meaning no user data containing personal information was actually viewed by human researchers. And there was no need to ask study “participants” for consent, as they’d already given it by agreeing to Facebook’s terms of service in the first place.
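The “machine analysis” described here was essentially word counting: a post is scored by the fraction of its words that appear on positive or negative word lists, so no human ever reads the post itself. Here is a minimal sketch of that idea in Python; the word lists below are invented placeholders for illustration, not the actual LIWC dictionaries the researchers used.

```python
# Simplified word-list sentiment scoring, in the spirit of the
# LIWC-style analysis the study describes. The word lists are tiny
# illustrative stand-ins, not the real LIWC dictionaries.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "lonely"}

def emotion_rates(status_update: str) -> tuple[float, float]:
    """Return the fraction of positive and negative words in a post."""
    words = status_update.lower().split()
    if not words:
        return 0.0, 0.0
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    return pos / len(words), neg / len(words)

# 2 of the 6 words are on the positive list, none on the negative list.
pos_rate, neg_rate = emotion_rates("so happy and excited about today")
print(pos_rate, neg_rate)
```

The study then compared these per-post rates across users whose feeds had positive or negative content filtered out, which is how a purely mechanical word count becomes an experiment on mood.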

But this is not the way experiments work in academia, and I’m very surprised that the institutional review boards of the universities involved accepted a study using participants who were totally unaware that they were being psychologically manipulated.

This is yet another reason not to trust Facebook. I’ve been on the fence about Facebook for a long time; I don’t use it a lot, mainly to keep in touch with people I haven’t seen in ages, or with internet friends. My Macworld colleague Chris Breen wrote, a couple of years ago, about why he left Facebook; he was more concerned about the privacy implications. But now, I’m going to think carefully about leaving Facebook myself. Recently, I’ve seen a number of things coming up in my news feed – often from other people posting on friends’ news feeds – that are angry, divisive, and just plain stupid. Many of them are the sort of viral conspiracy theory that makes Facebook a bad place to hang out; other things that just annoy me are the constant quizzes people post, asking “What kind of [noun] are you?” Oh, and Buzzfeed.

But the takeaway from this study is that Facebook can, and does, manipulate what you see for various reasons. In this case, it was a study to see if they could make you unhappy or happy; in others, it may be to get you primed to react positively to certain types of ads. (Though you can banish ads using the F.B. Purity browser plug-in.) No matter what the reason, the fact that they now know they can do this should make us all think very carefully about our relationship with Facebook.

P.S.: While I think he exaggerates a lot, Noam Chomsky’s Manufacturing Consent shows just how easy it is for the media to manipulate much of what we think. I try to avoid the tinfoil-hat type of thinking that Chomsky propagates, but much of what he says there is valid, if dated.

Facebook Won’t Let Me Create a Page With My Name Spelled Correctly


I use Facebook to stay in touch with friends and family, but I was thinking of creating a page for professional usage: sharing articles I publish on this blog, on Macworld, and on other web sites. When I tried to name the page McElhearn, I saw this:


Given the large number of names with inter-caps – not only Irish names, but others – and the number of companies with inter-caps as well, I find it astounding that Facebook would have this type of limitation. I’m perplexed. I’m going to try to contact Facebook, but I know that’s as useful as pissing into the wind…

Update: I found that after you create the page, you can edit the name on the Settings page to correct the capitalization. So if you come across this issue, you’ll find that it’s easy to fix. You can like my Facebook page if you want.

Facebook Wants Way Too Much Information


I was interested to see that Facebook has released a WordPress plugin. This would allow me to easily share content from this blog to my Facebook page.

I installed the plugin and went to configure it. After creating an “app” on Facebook, I saw the following:

Seriously? My mobile phone number? MY CREDIT CARD? WTF? Facebook, do you really think I’m giving you my credit card number?

Needless to say, I have deleted the plugin.