First, the basic facts:
- Some researchers at Facebook set out to find out whether emotional states can be transferred between people online. Put simply, they wanted to test whether interacting on Facebook with people who are sad (or happy) can make you sad (or happy) as well.
- To conduct the experiment, they selected some 600,000 Facebook users and, without telling them, started showing them either fewer of their friends' happy posts or fewer of their sad posts.
- They were able to do this because each post can be analysed to gauge its general mood. Remember, too, that Facebook offers a mood status that a person can attach to a post to convey how they are feeling.
- The result was that those who saw fewer happy posts (that is, more sad posts) became sadder themselves. And vice versa: those who got more happy posts became happier. After seeing the tweaked feeds, people started posting updates with similar emotions.
- This demonstrated that emotions can be transferred between people online.
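To get a feel for how posts can be sorted by mood, here is a minimal sketch of a keyword-count sentiment classifier. This is purely illustrative: the word lists are made up, and the researchers' actual method was far more sophisticated than this toy version.

```python
# Toy sentiment scorer: classify a post by counting words from two
# hand-picked lists. The lists below are invented for illustration.
POSITIVE = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE = {"sad", "terrible", "hate", "awful", "lonely"}

def post_mood(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' based on which
    word list matches more words in the post."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(post_mood("Had a great day, so happy!"))    # positive
print(post_mood("Feeling sad and lonely today"))  # negative
```

Once every post has a mood label like this, it becomes straightforward to filter a person's feed to show fewer posts of one mood or the other, which is essentially what the experiment did at scale.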
Full details of the research can be found here.
The problem is this: Facebook essentially turned 600,000 users into experimental guinea pigs without their permission and without giving them an option to opt out at any point. If this were the usual A/B testing, where users see different layouts and different content just so Facebook can optimise how the platform is used (which most internet companies, including Google, do), then maybe it would be understandable. It'd just be the usual creepy stuff that internet companies do with our data.
But this is more than that. It's an actual experiment testing something that doesn't necessarily relate to your everyday use of Facebook as a product. The human is the subject of the research, and the research involves altering their emotional state, a state they may carry even outside the time they spend on Facebook. One of those people could be you.
Do you think Facebook was right to do this without seeking the permission of the people they experimented on?