Facebook data scientist explains why the company experimented on its users: We care about you

Facebook data scientist Adam Kramer (aka Danger Muffin) – the man who wrote and designed the controversial Facebook mood study – has put up a public post on Facebook explaining why the company ran an experiment on the emotional impact Facebook has on its users.

Defending the work he carried out with co-authors Jamie Guillory, a postdoctoral fellow at UCSF, and Jeff Hancock of Cornell University, Kramer writes that they did the research “because we care about the emotional impact of Facebook and the people that use our product”, but admits the motivations for the study were not clearly stated in the paper. He apologised for the anxiety the study has caused.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” he writes.

“At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

Kramer also outlines the methodology: the research worked “by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012)”.
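
To make that mechanism concrete, here is a minimal, hypothetical Python sketch of the kind of per-load filtering Kramer describes. The 0.04% sample size and one-week window come from his post; the word list, the hash-based assignment, the drop probability and the function names are illustrative assumptions for this article, not details from the paper or Facebook’s actual code.

import hashlib
import random

# Illustrative only; the real study classified posts with an
# emotional-word lexicon rather than a short hand-made list.
EMOTIONAL_WORDS = {"happy", "sad", "love", "angry"}

def in_experiment(user_id: str) -> bool:
    """Deterministically assign roughly 0.04% of users (1 in 2,500)."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return int(digest, 16) % 2500 == 0

def filter_feed(posts: list[str], user_id: str, drop_prob: float = 0.1) -> list[str]:
    """On a single News Feed load, probabilistically omit posts containing
    an emotional word. Omitted posts are not hidden: they stay on friends'
    timelines and may appear on later loads, per Kramer's description."""
    if not in_experiment(user_id):
        return posts
    kept = []
    for post in posts:
        # Naive whitespace tokenisation, good enough for a sketch.
        has_emotion = any(word in post.lower().split() for word in EMOTIONAL_WORDS)
        if has_emotion and random.random() < drop_prob:
            continue  # deprioritized on this load only
        kept.append(post)
    return kept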

The study was published on June 2 in PNAS (Proceedings of the National Academy of Sciences), under the journal’s Social Sciences category.

Here is the full text of Kramer’s post:

“Okay, so a lot of people have asked me about my and Jamie and Jeff’s recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.”

It’s not the first time Facebook has experimented on users: a 2012 study looking at social influence and political mobilisation also cites Kramer as an author.
