Secret Facebook Test Riles Users

A mass experiment in emotional manipulation

For all its success in attracting over one billion users, Facebook is surprisingly tone-deaf when it comes to meeting their expectations. And this week, public uproar over the social networking giant's policies reached a new peak when it was revealed that Facebook had secretly altered how the service worked for almost 700,000 members so it could gauge their psychological reactions to overly positive and overly negative news.

Facebook's response is about as clueless as the original secret study: It says that it has done no wrong and that users implicitly agreed to this kind of testing when they joined the service.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," one of the study's co-authors explained in a hard-to-find Facebook post. "We were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. [But] we didn't clearly state our motivations."

Here's what Facebook really did: It secretly changed the way Facebook worked for 689,000 users to see whether it could affect their moods by displaying only overly positive or overly negative posts in their Facebook newsfeeds. The plan was to monitor those users' own posts to see whether the skewed newsfeed impacted what they experienced and wrote about.

"The goal of all of our research at Facebook is to learn how to provide a better service," the explanation continues. "Our goal was never to upset anyone."

This isn't really about upsetting individuals, though as one poster wondered aloud, what if the skewed posts had resulted in a Facebook user committing suicide? The bigger issue is whether this kind of social engineering could be used for nefarious purposes. For example, what if Facebook was able to someday impact the results of a presidential or other election in order to ensure that its favored candidate made it to office? What if Facebook promoted discord or revolution in a country facing civil war?

The most astonishing aspect to this story, perhaps, is that Facebook could have simply asked for people to participate in a vaguely explained study, and it probably would have been quickly overwhelmed by interested users. Instead it unleashed what it described as "emotional contagion" on an unsuspecting user base. Incredible.

And it is perhaps illegal too. As first noted by The Guardian, University of Maryland law professor James Grimmelmann argues that Facebook failed to obtain the "informed consent" of the study subjects as required by US laws that establish an "ethical and legal standard for human subjects research." "Federal law requires informed consent," he writes. "The study harmed participants ... The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That's psychological manipulation."
