The research, which revealed that the social networking giant could make its users feel more positive or negative by editing their newsfeeds, has significant implications for brands.
Of course, emotional engineering has long been the linchpin of marketing; John Lewis leaves me weeping into my teacup each Christmas through brute-force creativity. We, their customers, are complicit. But this is something altogether different; social media offers a level of manipulation that exceeds that of traditional media channels.
Facebook, in effect, creates the illusion of a community of our own creation. It has no editor-in-chief and no army of journalists. It relies instead on our own willingness to share, or at least our ever-increasing narcissistic tendencies, depending on your point of view, to create its own walled garden of content.
By sharing their lives on Facebook, consumers are, in effect, placing their trust in the community, a trust that has been trampled on by this poorly communicated research project.
Those commentators who argue that Facebook’s biggest mistake was making the research public are missing two key points. Firstly, in the networked era, ultra-transparency is not just a passing fad; it is a business necessity. Trust is the ultimate currency.
Facebook users may well have technically agreed to act as guinea pigs when they accepted the social network’s terms and conditions. But user agreements riddled with small-font legalese do not constitute informed consent.
Secondly, as marketers wise up to the fact that a Facebook "like" is the lowest form of human connection, the social giant needs to do more to prove its worth as an advertising platform. In short, it is in Facebook’s business interest to prove its impact on users' emotions.
Seduced by the social network
Technology is at its most seductive when what it offers meets our vulnerabilities. A Facebook "like" may pale in significance next to a real-life hug, but it is still a form of validation. Social media may reduce our relationships to followers or connections, but the power to wound, or in this case to manipulate consumers’ emotional state, is real.
Of course, no one is forcing anyone to be on Facebook; users who feel enraged at being treated as lab rats can simply disconnect. But for many users Facebook has become part of their social fabric, and for the generation increasingly shaping their world online, it is a huge amount of power for a single company to wield.
For its part, Facebook’s chief operating officer Sheryl Sandberg has apologised for the research, telling users "we never meant to upset you".
It's misleading to suggest that the social network is attempting to create the "totalitarian state" some of its harshest critics suggest. Facebook may have made a mistake but it is difficult to believe its original idea was rooted in knowingly deceiving or upsetting its users.
But this single study is undoubtedly the tip of the ethical iceberg.
As T.S. Eliot writes in The Hollow Men, "between the idea and the reality falls the shadow". The possibilities for emotional engineering and manipulation on social platforms such as Facebook pose a plethora of ethical dilemmas.
Marketers cannot afford to leave these questions in the shadows and bury the impact of technology, both good and bad, on their consumers’ lives.