Social media and The X Factor: what we learnt [infographic]

•    Social media predicted the winner
•    There’s no ‘hindsight bias’ here
•    Tulisa gains most Twitter followers for the first time in the series

With the eighth series of The X Factor now drawing to a close, we are able to reflect on what we have seen and learnt over the last 10 weeks.  For those who have been following our weekly blog posts since the live shows began, our focus has naturally been on social media – and the role it has played both for the contestants and for the programme makers.

We have crunched the numbers and analysed the data each week, and we’ve done it manually – investing many, many hours in the process.  We’ve not had access to the qualitative insights a robust social media measurement tool can provide, nor the data and analytics that the X Factor in-house teams do.  There is a plethora of tools out there we could have looked at, but for consistency we elected to stick with readily available figures that aren’t open to interpretation.  The numbers are what they are.  Regardless, the limited data we have captured has provided an incredible amount of insight, which we hope those of you who have followed along over the weeks have seen.
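As a minimal sketch of how little tooling this takes (the file name, column layout and contestant figures below are purely illustrative, not our actual data), here is the sort of thing a spreadsheet or a few lines of Python can do with counts noted down by hand from each contestant’s public profiles:

```python
import csv
from collections import defaultdict

# followers.csv is a hypothetical log, one row per contestant per week,
# filled in by hand from the publicly visible counts on each profile:
# week,contestant,twitter_followers,facebook_likes
totals = defaultdict(dict)
with open("followers.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["contestant"]][int(row["week"])] = (
            int(row["twitter_followers"]) + int(row["facebook_likes"])
        )

# Week-on-week growth per contestant: the only "analysis" needed for the
# charts in these posts is a subtraction between consecutive weeks.
for contestant, by_week in sorted(totals.items()):
    weeks = sorted(by_week)
    for prev, curr in zip(weeks, weeks[1:]):
        print(f"{contestant}: week {curr} gained {by_week[curr] - by_week[prev]:,}")
```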

There’s a phrase that can be aptly applied to many of the reports that have popped up since the show ended: “hindsight bias.”  It’s easier to see events clearly after they have occurred than to analyse activity, identify patterns and predict outcomes while it is still unfolding.  The “knew it all along” effect is something we see with many social agencies (be they social CRM, social monitoring or conversation agencies); delivering actionable insights within real-time environments is a more challenging path to tread.

There are many lessons to be learnt from the whole process, but for me, I hope the biggest message is not actually related to The X Factor or even social TV.  It’s that it is possible to gain real and tangible insight into your social media activity (and thereby demonstrate its value internally) without spending a large slice of your social media budget (if you are lucky enough to have such a thing!) on measurement tools.  We’ve not done anything that you can’t do yourself.  Just to be clear, I’m not suggesting that monitoring tools don’t have a valuable role to play (they certainly do); just that we shouldn’t be blinkered to the resources we have available at our own fingertips.

From the outset, we have been very clear that the data we were capturing couldn’t encapsulate all aspects of the show, nor everything happening in the social media space.  Rather, it would be a sound indication of what the public are thinking and feeling towards individual contestants, and of how what happens outside the show can affect a contestant.

Ten weeks ago, in our first X Factor blog post for The Wall, these are the questions we set out to answer:

1.    Do ITV’s viewing figures correspond to growth and engagement across social channels?
We have seen quite clearly that viewing figures do correlate with the number of fans contestants gain during live shows (discussed in more detail in Week 9; a simple way of checking this kind of correlation for yourself is sketched after this list).

2.    Do the judges’ comments (negative and positive) correlate with the finalists’ audience size?
They definitely have an impact; most notably, Misha B experienced this first hand during this series.  Up until Week 4, Misha B was a rising star in terms of her social media growth; from that point on it went downhill and she never recovered in the social media environment, repeatedly finding herself in the bottom two.  Even judge Gary Barlow stated in Week 9 that the bullying allegations killed her chances of winning.  The social media figures back this up, as discussed in more detail in Week 4.

3.    Do contestants that open and close the show see bigger growth during the week?
This is not something we discussed during the series, as we found no correlation between opening or closing the show and social media growth.

4.    Do contestants that have been given the boot drop off the social media radar?
As was to be expected, the growth of former contestants’ social media channels slowed considerably once they left the show.  However, it didn’t stop entirely, and none experienced negative growth.  What helped former contestants was what they did to maintain their profile ‘in the real world’ through gigs and appearances, as well as staying active across Twitter and Facebook.  Those who let their channels stagnate after their departure naturally saw a much slower pick-up of new fans.  Amelia Lily is a positive example of this, staying active after her exit from the show with various gigs around the country until her return.

5.    Can we predict who will be in the bottom two acts based on the growth and engagement on their social profiles?
In the early weeks, no, not at all.  During the first half of the live shows, there are not as many strong connections with individual contestants.  Despite this not being the first time The X Factor has been on TV, the viewing public seem to need continual reminding that their favourite is not safe and that they need to pick up the phone and vote.  We saw this with Sophie Habibis early on, who was one of the most popular contestants on social media each week.  It’s also evident with Little Mix, who were at the bottom of the pack each week until Week 4, when they found favour with the public.  In the latter half of the series, it was clearer who would not make it through to the final and in fact be leaving that week, such as Craig Colton, Janet Devlin and Misha B.

6.    Can we use this to predict who will leave each week?
A very similar answer to the question above.  There are so many variables, from how a contestant’s personality comes across to their individual performance on the night, that it is very difficult – particularly during Weeks 1 to 5 – to get a clear indication from social media on this.  It was much clearer in the latter half, particularly Weeks 7 to 10.

7.    Is there any correlation between the size of an individual’s social profile, the amount it grows by, and the order in which the safe contestants are announced?
This is not something we covered during the 10 weeks, as we didn’t see any pattern here.

8.    And the final question – can social media determine the winner of The X Factor 2011?
Not a straightforward question to answer.  The analysis clearly shows a strong connection between a contestant’s popularity across social media and how they progress through the show.  There is no doubt that the work a contestant puts into engaging with their fans across Facebook and Twitter has a significant role to play, but it is not the deciding factor in whether they can or will win the show.  What we have seen is that monitoring social media can give us a very clear indication of who will win (evident from Week 7), and who may go (as discussed, in the latter part of the series).
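For anyone who wants to check a relationship like the one in question 1 for themselves, here is a short sketch of the kind of sanity check we mean; the viewing figures and weekly fan-growth numbers below are invented placeholders, not the series data:

```python
from math import sqrt

# Placeholder numbers only: weekly viewing figures (millions) and the total
# new fans/followers gained across the remaining contestants that week.
viewing_millions = [11.2, 10.8, 11.5, 10.9, 11.7, 12.1, 11.9, 12.6, 13.0, 13.4]
weekly_fan_growth = [52000, 48000, 55000, 47000, 60000, 66000, 63000, 71000, 78000, 84000]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A value close to +1 supports the claim that bigger audiences go hand in
# hand with bigger social media growth for the contestants that week.
print(f"correlation: {pearson(viewing_millions, weekly_fan_growth):.2f}")
```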

It’s worth noting that the figures for YouTube videos weren’t updated.  We aren’t sure if it was a glitch (we haven’t seen any report of one), but the figures weren’t correctly presented again until Tuesday, and for this reason we haven’t included the usual look at YouTube views.  Looking at this week’s show, Little Mix certainly got the two-screen audience behind them, amassing nearly 70,000 new fans in less than two days and leaving Amelia Lily (20,603 fans) and Marcus (30,489) in their dust.  For the first time, Little Mix were also the act with the most Facebook fans, with 119,150 Likes.

Mentor and N-Dubz member Tulisa also benefited from the excitement around the final: for the first time this series, she gained the most Twitter followers of all the judges – almost 10,000 more than she usually receives, 8,000 more than Kelly Rowland and 11,000 more than Gary Barlow.

Engagement was high across Facebook for each of the three finalists, with each seeing more than 1,000 additional people interacting with their official Facebook page compared with before the shows.  The girl band also grew the most over the course of the week, with 127,800 new Likes and followers – more than 50,000 more than either Amelia Lily (75,663) or Marcus (71,236).

Former contestants also benefited from the higher-than-normal viewing figures and excitement around the final, with almost all 13 of them growing their profiles at a higher rate than they did in Week 9.

Being able to monitor and understand activity and impact in a social media environment is incredibly important, particularly as television and programme making evolves.  It enables programme makers and the audience to influence and interact with activity as it unfolds and can, because of the way we use and consume social technologies, influence the conceptualisation of programmes.

The X Factor US, which started not long after the UK series, seems a step ahead of The X Factor UK in terms of how the show is embracing social media.  Not only is it accepting votes via Twitter and Facebook, but tweets and #hashtags are discussed in the main broadcast (as opposed to just on the after-show), with the host actively encouraging viewers to get involved.  The X Factor US app provides a much richer experience for fans of the show as well.  Where the UK show has a “tap to clap” app, American viewers can vote in real time on a contestant’s outfit, performance, staging and song choice, and share that on a social network.  They can also choose, in real time, which cameras they would like to view, including those pointed at the audience and backstage.

Overall, it has been a much more socially progressive show than previous series, although there are many lessons to be learnt – both from what is being done across the pond and from what has unfolded here in a digital environment.  We may very well see social voting in the UK for the ninth series of The X Factor next year, as well as better integration between social, the live show and mobile.

What have you made of the show overall and its use of social media?

Week 9: The X Factor: Social media and the live shows infographic
Week 8: The X Factor: Social media and the live shows infographic
Week 7: The X Factor: Social media and the live shows infographic
Week 6: The X Factor: Social media and the live shows infographic
Week 5: The X Factor: Social media and the live shows infographic
Week 4: The X Factor: Social media and the live shows infographic
Week 3: The X Factor: Social media and the live shows infographic
Week 2: The X Factor: Social media and the live shows infographic
Week 1: The X Factor: Social media and the live shows infographic

Rachel Hawkes is an account director at communications consultancy Elemental, @elementalcomms.

  • John Barton

    Hi Rachel,

    That’s an excellent infographic, with valuable quantitative data comprehensively presented. Based on our experience, I felt it relevant to the debate to offer my opinion on a couple of your points above. We recently released a qualitative social media X Factor study focused on weekly mentions based on propensity to win. If you haven’t already come across it, you can read the full study, see our methodology etc. here:

    Firstly, in reference to “the numbers are what they are”, I quite agree. There is no arguing with hard stats. They are also relevant and interesting. However, what do they really mean? When speaking about social media trends, this is the first thing any client will ask you. If you choose to set the parameters, stop at the numbers and draw what conclusions you can, then that’s fine. It is incorrect, however, to imply that a more qualitative study with the aid of a monitoring tool is in some way less consistent.

    As long as the parameters are set and agreed at the beginning of the study and form the basis of the study to its conclusion, it should be as robust as any other sample-based study. If this is not acknowledged, it is tempting to draw conclusions such as more likes/views etc. = more votes and make that a definitive indicator – as outlined here:
    (Appreciate that you have caveated that in the article above).

    For instance, we found that Frankie Cocozza’s performances were often more popular in terms of noise, views etc. than the votes he actually received. That’s where a good qualitative strategy comes in.

    In terms of monitoring tools, you make a good point here. Many make the mistake of thinking tools will take the pain of monitoring away. Whilst it is almost impossible to monitor and rank effectively at great volume without a tool, it should be regarded as just that: a tool. It should be used to help drill down and eliminate irrelevant ‘noise’, not as a robust indicator of propensity, sentiment or otherwise. In short, human analysis is essential, and any good CRM/monitoring platform will actively promote this integrated approach.

    In terms of what you refer to as “hindsight bias” – aren’t all studies based on the learnings of hindsight? I think any good monitoring/social CRM agency, or otherwise, would be at pains to avoid suggesting they can predict the result before the event itself with any firm commitment.

    However, if the data collected points to this trend (as in the case of the bottom two with The X Factor), that is another matter entirely. In terms of “bias”, I’m not sure how this would benefit anyone involved, as all the data pulled is in the public domain. I think the point I am trying to get across is that quantitative and qualitative study both have their place in social media monitoring and research, ideally alongside one another.


  • John Barton

    Formatting and spacing doesn’t exactly translate well from the comments box either! Apologies, John

  • Peter Wood

    Hi Rachel, a bit of a strange post really. Firstly, social media tools aren’t a large part of anyone’s budget. Opening a profile on Radian 6 for a month costs under £400. If that’s a large part of a client’s budget, then you’d have to wonder how much opportunity there is in the first place.

    As for the idea that you can perform analysis without those tools – well, of course you can. The tools tap into publicly accessible data. They just make it easier to weed out irrelevant noise and delve deeper than simple popularity uplift measurements.

    The measurements you’ve taken are very basic; if I presented those stats to a client, I think they’d ask for their money back. I’m really not sure what ‘hindsight bias’ references, but the whole article seems like a massive dig at people who’ve taken time out to analyse more than just the most rudimentary stats.

    ‘delivering actionable insights within real-time environments is a more challenging path to tread’

    I thought that when you made the above statement you had the answer in your post, but I couldn’t find it.