The Facebook Emotion Experiment (It Probably Involved You)
There’s a lot of press happening right now surrounding an experiment that Facebook conducted back in 2012 on more than 680,000 Facebook users. The experiment is being dubbed “the emotion experiment,” and, as the name suggests, it intentionally messed with people’s emotions in order to provide Facebook with additional marketing information.
Here’s a closer look at the experiment, and what’s happening with the investigation involving details of the experiment right now.
Playing With Emotion
Facebook selectively removed positive news updates from user feeds for one week. As a result, affected users posted 0.1% fewer positive updates of their own. In turn, this led people to begin publishing more depressing updates. The whole thing was a sort of domino effect – show people sad or depressing stuff, and people will respond with more sad and depressing stuff. Facebook’s aim in conducting this experiment was to figure out how people respond to emotion, and to determine whether the belief that an overabundance of happy news updates makes people sad is true – which, in this case, it wasn’t.
The problem with the Facebook experiment is that the company didn’t notify users that the experiment was happening. The other obvious problem here is that the company tried to make users depressed – and it worked. Ethics aside, the really scary thing about the Facebook experiment is that it proves just how powerful major tech companies are.
Facebook had the power to depress thousands of people – imagine what might happen if Facebook decides to evoke a different emotion. Could a social media company influence the way that thousands of people think? This kind of emotional experimentation could even lead people to riot, to injure others or themselves, and to believe anything that Facebook wants people to believe.
Is It So Wrong, Though?
Companies test emotions all the time. It’s called A/B testing, and it’s why those commercials tug at your heartstrings, why you cry when you hear a song, and why some online ads seem to speak directly to you. Facebook hasn’t really done anything that other companies don’t already do, but because it’s Facebook, the company is being held seriously accountable.
Measures Being Taken
A US watchdog group, the Electronic Privacy Information Center, has recently filed a complaint about the experiment with the US Federal Trade Commission. The group insists that the Commission look into the Facebook experiments. Facebook states, in the company’s defence, that the testing isn’t out of the ordinary, and that it’s all legal according to Facebook’s privacy policies (which all users must agree to prior to using Facebook). So, I guess the question now is: how badly do you want to use Facebook? Or, it might be: is this really such a big deal?
Facebook is still under investigation for this experiment (at the time of this writing), but there’s bound to be some kind of backlash. In the meantime, it’s up to users to determine whether or not the social network has violated personal ethics – if so, it might be time to disconnect.