Facebook Lies about its Mind Experiment; Doesn’t Care about Your Concerns


I wrote the other day that Facebook conducted a massive psychology experiment on users, and you may have been a guinea pig. A lot of pixels have been spilled about this, and some of the information that has come out about the experiment is interesting.

First, the people behind the experiment just don’t understand why people are upset. In a statement posted on Facebook (of course), Adam Kramer, the lead researcher, said:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

He did express some surprise at the reaction, and did, sort of, apologize:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

But his apology was for “the way the paper described the research and any anxiety it caused,” not for the experiment itself.

Meanwhile, Forbes has a great deal of information about this study.

First, they report that there was no institutional review board approval at the universities involved, only an internal review within Facebook, which is certainly not the right group to decide whether or not this type of study is ethical. The editor of the journal that published the study said that “the data analysis was approved by a Cornell Institutional Review Board but not the data collection.”

“Their revision letter said they had Cornell IRB approval as a ‘pre-existing dataset’ presumably from Facebook, who seems to have reviewed it as well in some unspecified way.”

Then, Cornell University stated that it didn’t need to review the experiment:

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

Even more damning, Facebook, which claimed that all users agreed to participate in “research” as part of its terms and conditions, only added the word “research” to those terms four months after it carried out the experiment. And there was no age filter, so users aged 13 to 18 may have been part of the study, a clear violation of the law in the US and other countries.

The New York Times reports that British regulators are looking into this case, and the US Federal Trade Commission is also considering an investigation.

So there was no informed consent of any kind, no institutional review board approval, and Facebook may have experimented on adolescents, which is illegal. Can you trust them with anything at all?




1 reply
  1. jfutral says:

    “the way the paper described the research and any anxiety it caused,”

    Possibly ironically, this may cause more anxiety than any stream of negative or positive posts on anyone’s timeline.

    What I am most interested in is, if Facebook is found to have broken any laws, what kind of punishment could possibly be imposed to make any kind of difference? I mean, “D’uh” factor removed, either in its findings or in the fact that Facebook would do this, what does a judicial system do? Fine them? Imprison someone? Facebook has shown itself surprisingly resilient to most any bad press. People want what people want. I can’t figure out how this will have any long-term negative impact or change the internal culture at Facebook regarding news-feed algorithms. Sadly.

    Joe

