In a week-long experiment conducted in January 2012, Facebook injected the News Feeds of nearly 700,000 of its users with negative content to see whether it would make the posts those users wrote more negative. The researchers believe that it did: the mood of the posts in subjects' feeds moved like a "contagion" into the subjects' own posts. The inverse, too, was true, the researchers say. Facebook deliberately messed with the moods of its users, who, allow me to remind you, are Facebook's bread and butter. Exposing users to the advertisements of Facebook's partners, on and off Facebook, is the social media giant's only real business. So it's surprising that Facebook would conduct such experiments, and even more surprising that it would be dumb enough to publish the results. The paper was published in the Proceedings of the National Academy of Sciences.
Facebook's News Feed (the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone) is not a perfect mirror of the world. But few users expect that Facebook would change their News Feed in order to manipulate their emotional state. We now know that's exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies merely observed Facebook user data, this one set out to manipulate it.

The experiment is almost certainly legal. In the company's current terms of service, Facebook users relinquish the use of their data for "data analysis, testing, [and] research." Is it ethical, though? Since news of the study first emerged, I've seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.
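For readers curious how a post gets labeled "positive" or "sadder than average" in the first place: the study relied on simple word counting (using the LIWC text-analysis tool), not any deep understanding of meaning. A post is scored by the fraction of its words that appear in fixed lists of emotional words. Here is a minimal sketch of that approach; the tiny word lists and the baseline rate are made up for illustration and are not LIWC's actual dictionaries or the study's real thresholds.

```python
# Word-count sentiment scoring, LIWC-style. The word lists below are
# tiny illustrative stand-ins, not LIWC's real dictionaries.

POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "terrible", "hate", "awful", "lonely"}

def emotional_word_rates(post: str) -> tuple[float, float]:
    """Return (positive, negative) emotional words as a fraction of all words."""
    words = post.lower().split()
    if not words:
        return (0.0, 0.0)
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos / len(words), neg / len(words))

def is_sadder_than_average(post: str, avg_negative_rate: float = 0.02) -> bool:
    """Label a post 'sadder than average' when its negative-word rate
    exceeds a (hypothetical) corpus-wide baseline rate."""
    _, neg = emotional_word_rates(post)
    return neg > avg_negative_rate
```

The crudeness of this method is part of the story: a post like "not happy at all" would count as positive, which is one reason critics questioned how much the study really measured mood.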