Facebook has apologized (sort of) for the so-called emotion manipulation experiments that came to light over the weekend, when a massive study published online by the Proceedings of the National Academy of Sciences suddenly began drawing attention.
Though the experiment was conducted more than two years ago, between Jan. 11 and Jan. 18, 2012, the recently published results have set off considerable debate. It’s another controversy for a social site constantly criticized for its use of member data.
The study apparently manipulated the news feeds of 689,003 members to determine whether sharing of emotional postings, both positive and negative, would affect their emotional state — and in the end how they engage socially.
But the uproar over the weekend led Facebook to reach out to followers in an effort to explain the company’s actions. In a post on his own blog, Facebook data scientist Adam D. I. Kramer explained:
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
By way of apology, Kramer added:
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
Among commenters on Kramer’s post were some who seemed genuinely unfazed by the experiment. Others echoed criticism in the academic community that the reference to “research” in Facebook’s terms of service doesn’t give the social network the right to use its members as guinea pigs without their consent.
As Kashmir Hill, who writes about digital privacy for Forbes, notes:
“Based on Kramer’s remarks and Facebook’s statement, it’s evident that the company still doesn’t understand the core concern of critics: That testing whether users’ emotions can be manipulated through content curation is creepy.”
The fallout from this latest experiment is yet another reminder of the care businesses of all kinds must take when dealing with the data of their customers and communities.
It’s also more evidence that Facebook is manipulating news feeds — including perhaps the news you’re sharing with your followers.
Facebook Photo via Shutterstock