Biz Tips: What’s the biggest lesson marketers can learn from the UK EU-Membership referendum challenge?

What’s the biggest lesson marketers can learn from the UK EU-Membership referendum challenge?

And the answer isn’t about how to change opinions…

In his column for Marketing Week, Mark Ritson laid down a challenge to marketers regarding the UK’s EU membership referendum:

Did the illegal 6.4% overspend enable Vote Leave to win the day?

Was the additional spend of £449k (and change) critical in convincing enough people to vote leave and pave the way for Brexit? According to the article, marketers are the only people who can work this out.

I don’t agree with this and, while Mark Ritson qualifies it by saying “a particular marketer with advanced analytics training and nothing better to do for the next few days”, I still think there might be other particular people in other fields who would have as much chance of being able to prove it as a particular marketer. Assuming, that is, that they too had nothing better to do for the next few days.

However, this isn’t about the skills and abilities of marketers against other professions; this is about the working-it-out part. Personally, I just don’t think that’s something that can be done, and we might just have to accept that we’ll never know the answer. However, just thinking about this challenge can be a useful intellectual exercise in marketing analytics.

The over-simplified, Facebook-exclusive analysis

Let’s back-of-envelope this. According to Statista, the average CPM for Facebook ads in the UK is $3.15. If we convert that to GBP, we get £2.48 at the time of writing. In the run-up to the referendum, the UK electorate was around 46.5 million people. Using those figures, the campaign overspend could have paid for around an extra 181 million Facebook ad impressions; just under 4 per potential voter.

Given Facebook’s strengths in targeting particular users, it is likely that these impressions were concentrated on a small group of the electorate: perhaps those swing voters who had yet to finally decide, or those who might be thinking ‘leave’, but could yet change their minds. If there were around 10 million of those people, then the overspend could have paid for an additional 18 impressions per person.

Seeing an ad an extra 18 times sounds like it could be persuasive, but we’re just looking at the overspend here; the allowable £7 million budget could already have paid to serve each person in that group 282 impressions. If you had seen 282 adverts telling you that the UK was better off out of the EU and hadn’t been convinced to vote leave, would another 18 messages do it? And what happens when you put those into a real-life situation where you’re constantly bombarded with newspaper headlines, political rhetoric, or water-cooler discussion of the issues with people at work or in the pub?
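The back-of-envelope sums above can be reproduced in a few lines. All inputs are the figures quoted in the text ($3.15 CPM converted to £2.48, a £449k overspend, a £7 million budget, a 46.5 million electorate), and the 10-million-person target group is the same assumption made above:

```python
# Back-of-envelope impression counts from the figures quoted above.
CPM_GBP = 2.48             # Facebook CPM in GBP (converted from $3.15)
OVERSPEND = 449_000        # overspend, GBP (approximate)
BUDGET = 7_000_000         # allowable campaign budget, GBP
ELECTORATE = 46_500_000    # approximate UK electorate
TARGET_GROUP = 10_000_000  # assumed pool of persuadable voters

def impressions(spend_gbp, cpm_gbp):
    """CPM is the cost per 1,000 impressions, so impressions = spend / CPM * 1000."""
    return spend_gbp / cpm_gbp * 1000

extra = impressions(OVERSPEND, CPM_GBP)
print(f"Extra impressions from overspend: {extra / 1e6:.0f}m")   # ~181m
print(f"Per potential voter: {extra / ELECTORATE:.1f}")          # ~3.9
print(f"Per targeted voter: {extra / TARGET_GROUP:.0f}")         # ~18
print(f"Budget impressions per targeted voter: "
      f"{impressions(BUDGET, CPM_GBP) / TARGET_GROUP:.0f}")      # ~282
```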

Obviously, this is a very superficial presentation of the numbers, but it illustrates the complexities and problems of trying to attribute causal effects to marketing activities.

Why we can’t prove whether the overspend caused Brexit

Unless we can go back in time to the referendum announcement, copy and paste our planet into one universe where the campaign spend was legal and one where it had the overspend, we’ll never know. And, even if we could do that, a sample size of one in each group is unlikely to be that convincing.

Unfortunately, in observational studies where we don’t have an interventional experimental design, we can’t really prove causation. Now, with enough data and a lot of care and attention to controlling for other variables, we can achieve a level of confidence that we are looking at a causal relationship.

In some cases, we can even get close enough to make lifestyle decisions based on them: there aren’t too many large studies where we divided a group of people into two and forced one group to smoke for forty years, for example, but we’re fairly convinced smoking can increase your risk of lung cancer (of course, we also have experiments that illustrate the underlying molecular mechanism for that).

Can machine learning and data science help?

Basically, our issue is the number of variables and how they could interact, particularly when we only have one real output: the referendum result. Yes, we could maybe break the results down at the level of each ward or constituency, look at what the result was, try to work out how much was spent in those areas, and perform some regressions.

But could those areas have been targeted for increased spend because the campaign identified more voters who could possibly vote leave there? If that’s the case, we’d see a correlation, but we couldn’t infer a causal relationship from that.
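To see why, here’s a minimal toy simulation (every number invented) in which ad spend has zero causal effect on the vote, but because the campaign targets wards that already lean leave, spend and leave share still correlate strongly:

```python
import random

random.seed(42)

# Toy model: each ward has an underlying leave propensity (the confounder).
# The campaign spends more in leave-leaning wards, while the vote is driven
# by the propensity alone, so spend has NO causal effect here.
wards = []
for _ in range(500):
    lean = random.uniform(0.3, 0.7)               # underlying leave propensity
    spend = 10_000 * lean + random.gauss(0, 500)  # targeting, not persuasion
    leave_share = lean + random.gauss(0, 0.02)    # vote ignores spend entirely
    wards.append((spend, leave_share))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

spends, shares = zip(*wards)
print(f"Spend vs leave share correlation: {pearson(spends, shares):.2f}")
```

A ward-level regression on this data would show spend strongly “predicting” the leave share, even though, by construction, the ads changed nothing.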

We could look at opinion polls, but they can be quite noisy, and we would probably lose so much sample size that we couldn’t come up with meaningful regressions based on data from our individual wards or constituencies.

It doesn’t matter if we try to use the most cutting-edge boosted-tree machine-learning models, esoteric support vector regression models or some deep-learning algorithm: we just don’t have enough of the right data to work with, and there are just too many unknowns.

To borrow an example from across the pond, we can say that Trump’s team ran a great election campaign, but what was the effect of the FBI’s statement regarding Clinton’s email accounts? We could ask thousands of people if that made them change their minds, but even if we extrapolate those numbers across the US, can we prove that statement lost her the election, or can we just say it was likely that it cost her the presidency? Do we even know if that is what changed their minds, or is that just what they think did?

Against a complicated backdrop of leave and remain discussions in the media, how can we absolutely know how the myriad advertising campaigns interacted and impressed their messages upon the electorate? How many people made up their minds on the morning of the election on the basis of the headlines on the papers they saw when they ran out to get the milk?

If you look at this chart from The Economist:

Figure from The Economist

you can see that the biggest increase in support for leave happened in the couple of months prior to the referendum. Did this correlate with increased marketing spend by the leave campaigns? Probably, but you would expect it to, as they would want to increase visibility as polling day approached. It probably also correlated with increased discussion on the news and increased headlines on the front pages.

We could also imagine that, as leave was lagging in the opinion polls quite significantly up until that point, perhaps pro-leave politicians increased the aggressiveness of their speeches around that point. When we start to look at how things correlate over time, we need to be particularly careful.
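Trending time series are the classic trap here: any two quantities that both ramp up towards polling day will correlate almost perfectly, whatever the causal story. A toy illustration with invented figures:

```python
# Two invented series that both simply ramp up towards polling day.
weeks = range(20)
ad_spend = [50 + 10 * w for w in weeks]     # £k per week, steadily increasing
front_pages = [2 + 0.5 * w for w in weeks]  # leave-themed headlines per week

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Both series are linear in time, so they correlate perfectly (1.00)
# despite there being no causal link between them in this toy example.
print(f"Correlation: {pearson(ad_spend, front_pages):.2f}")
```

This is why analysts often work with week-on-week changes rather than raw levels before looking for relationships between time series.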

What this is, though, is a great example for all of us of how to design our marketing campaigns with causation versus correlation in mind. If we design our campaigns appropriately, can we perform interventional experiments so that every campaign teaches us something to help plan the next one?

When to experiment, and when to give it the beans

Just as with so many things in marketing, causation and attribution are often very difficult things in which to have confidence. You can build as thorough and complicated a multiple regression model as you want, but unless you’re working with a case-controlled study, you can’t really prove anything definitively. You don’t always have to, though, and importantly, you don’t want to compromise the potential effectiveness of your campaigns to try.

Correlation is easy, but proving causation can be hard. For many of our purposes though, correlation is good enough, and correlations can often be very useful in helping us make decisions. As long as we remember that they are just correlations.

Imagine you have a chain of stores across the country and you are planning a sale. You increase your marketing spend to promote that sale, and you find that you put more money in the tills that week. Did your marketing activity work? You increased spend, and takings went up, but did one cause the other?

Maybe you sold more because the offer was better? Did more people convert simply because things were cheaper? What if footfall increased? Did your campaign drive people into the shops, or did a change in the weather get more people out of their houses (or away from their barbecues) and onto the high street?

One way to test this would be to divide your stores into experimental and control groups, and to only advertise the sale to the experimental group. If the stores in the areas that received the marketing consistently outperformed those in the control group, you can be more confident that your marketing interventions drove that increase.
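As a sketch of that analysis (store counts, takings and the uplift are all hypothetical), a geo-split test boils down to random assignment followed by a comparison of group means:

```python
import random

random.seed(0)

# Hypothetical geo-split test: half the stores see the sale advertising.
store_ids = list(range(100))
random.shuffle(store_ids)                 # random assignment is what makes
treated, control = store_ids[:50], store_ids[50:]  # the comparison causal

def weekly_takings(advertised):
    base = random.gauss(20_000, 2_000)    # normal weekly takings, GBP
    uplift = 1_500 if advertised else 0   # the "true" ad effect in this toy world
    return base + uplift

treated_takings = [weekly_takings(True) for _ in treated]
control_takings = [weekly_takings(False) for _ in control]

mean_t = sum(treated_takings) / len(treated_takings)
mean_c = sum(control_takings) / len(control_takings)
print(f"Estimated uplift per store: £{mean_t - mean_c:,.0f}")
```

In practice you would also want a significance test (a two-sample t-test, say) before claiming the uplift is real rather than noise.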

Of course, you can’t do that. I wouldn’t want to be the person who presented the report on the performance of the sale and had to say, “Well, the good news is that we know our campaign activity was effective, the bad news is that we only advertised to half our market.”

And that sums up large, do-or-die single events, like an EU membership referendum. Those aren’t the times to experiment. Sure, you might spend more than you needed to, but if you have a budget that is put aside to do one job, you use it. If someone gives you a hammer and says they’ll give you £1 million if you can crack a walnut with one hit, you don’t try to hit it just hard enough to open it; you hit it as hard as you possibly can.

Brexit challenge lessons for marketers

Bringing things back to Mark Ritson’s initial challenge, Marketing Week recently published three responses to the question. The consensus is that we’ll never know for sure, though the published opinions suggest the overspend probably didn’t matter. But what can marketers across all sectors learn?

Even if a detailed post-mortem of the campaign isn’t required, it can sometimes be helpful just to spend a few minutes thinking about it while the kettle boils for the 11 am cup of tea. As you analyse the campaign just gone, and prepare for the one to come, here are some things to think about:

  • Correlation doesn’t always mean causation; be especially careful interpreting time-series data
  • Experiment when you can, so you have the information for when you can’t
  • Define what the objectives are and how the activity will be judged: do you need to be able to conclusively report on the causative effects of spend?
  • Think about what other variables might be at work: could something else have driven your results or limited campaign effectiveness?
  • Record as much data as feasibly possible and incorporate information from other sources
  • Consider KPIs at the planning stage: is the role of the campaign to get people to the website, or to get them to convert? You can get a lot of traffic, but if no-one likes your product or service…

The Brexit saga will continue to run and run, and no doubt the overspend will be used as an argument for a second referendum. Ultimately though, it provides an opportunity for us marketers to look at what we do, how we do it, and think about just how much influence what we did had on the final result…


What’s the biggest lesson marketers can learn from the UK EU-Membership referendum challenge? was originally published in Marketing And Growth Hacking on Medium, where people are continuing the conversation by highlighting and responding to this story.
