Facebook’s experiment raises the issue of manipulation and unintended consequences
There has been a lot of fuss lately about the psychological experiment that Facebook conducted on nearly 700,000 of its users. In order to gauge how people’s Facebook “News Feeds” affect their moods, the company temporarily implemented a new algorithm to display slightly more positive messages to some users, and slightly gloomier ones to others. As it turns out, people’s posts shifted to reflect the tone of their friends’ posts.
But the furor missed some of the most interesting questions, focusing (as usual) on Facebook’s tone-deafness. Nobody seemed interested in the obvious question of whether the findings reflected a genuine shift in mood, or simply a desire, conscious or unconscious, to fit in.
What has people outraged is the notion that Facebook is manipulating its unwitting users to advance its own agenda, with many citing the secrecy surrounding the research to illustrate the company’s misconduct (though the company published the results with no apparent sense of unease). But, though Facebook’s lack of transparency is certainly disconcerting, as is its deafness to its users’ concerns, these complaints miss the point.