#FacebookExperiment – was it all just a storm in a teacup?

As the dust of the now infamous #FacebookExperiment begins to settle, we wonder what we have learnt. The experiment, which left some users renouncing their profiles and put the social media giant under investigation, has been described as everything from an important practice to an Orwellian nightmare – so let’s take a step back and look at the bigger picture.

For a week in 2012, Facebook conducted an experiment to analyse the site’s emotional impact on its users. A collective of researchers, in conjunction with Cornell University and The University of California, San Francisco, varied the tone of some 700,000 of its users’ newsfeeds to see whether reducing positive or negative content would emotionally affect users or alter posting behaviour.

In principle, this is straightforward website optimisation, which all big e-commerce, news, content and social media sites perform regularly in the constant fight to win, and keep, users.

While I was digital marketing manager at Myspace, daily tests were carried out to make the site stickier. Around 2008, Myspace conducted a survey similar to Facebook’s experiment to understand the behaviour and output of specific communities, with the aim of retaining users and helping advertisers deliver the right content.

In this case, Facebook validated the theory of emotional contagion. This may not seem ground-breaking, except that many people, including users themselves, often presume the opposite: that seeing positive content on social media provokes negative emotions such as jealousy and a sense of being left out.

Following the backlash, one of the experiment’s key researchers, Adam Kramer, said the intention was always to investigate this presumption alongside the possibility that exposure to negativity would mean reduced usage of Facebook. It is something of an unfair irony that, as Kramer puts it, “the research benefits of the paper may not have justified all of this anxiety.” A sample of some 700,000 of Facebook’s 1.23 billion monthly active users is still a considerable figure, and the initial feeling of intrusion was warranted. But the points at which this experiment became contentious – the two-year delay in the study surfacing and the failure to ask users’ permission – were avoidable.

For the vast majority of the Facebook population, however, this story will have had negligible effect on site usage and those professing its demise are totally outnumbered. Facebook is just too good a communication tool to suddenly boycott, and its global benefits far outweigh the passing drawbacks or moments of hysteria.

Though Facebook made a mistake in not being upfront about the experiment, testing of this kind is a necessary part of maintaining any product – Sheryl Sandberg’s decision to apologise for how the study was communicated to the public, but not for the study itself, is testimony to this.

For us, the users, this is a poignant reminder that when we sign up to a service like Facebook and willingly upload our personal data and content, we are handing it over to a corporate, profit-making entity that is always striving to make its service better and generate more money from advertising. Ultimately, if the service you’re using doesn’t cost any money, you are in fact paying with your data and your attention, a global commodity now more valuable than oil, coffee or gold.

Chris Harris is co-founder and director of social media agency Harkable.