Tech is radically reshaping political campaigning: regulators must catch up
This week the Information Commissioner’s Office (ICO) published its review into political parties’ use of data, along with a £500k fine for Facebook relating to the Cambridge Analytica data breach. The very short, simplified version is this: most of the UK political parties have probably been breaching multiple data protection rules; they need to get their act together; and we desperately need new regulations.
Part of the problem is that data use in campaigns is getting very complicated. The impression I had from reading the review (which is available here) is that political parties have amassed a lot of data about voters, and are using all the latest profiling techniques and targeting systems (third party data, inferred data, Facebook custom audiences, lookalike audiences etc) to reach us. But they don’t really know what they’re doing. They don’t seem to know, for example, what constitutes personal data, and are making a series of assumptions about how they can legally use third party data and various targeting services. I don’t blame them: they’re dealing with chains of data and targeting algorithms that are beyond the comprehension of even the most well-meaning campaigner. The ICO took a sensible approach – not to start meting out fines, but to issue a warning, ask the parties to shape up, and call for new rules to clarify what’s what. In my experience, many organisations breach rules accidentally when they don’t know what to do. (That said, unclear regulations are often exploited by organisations.)
The bad news, of course, is that things are going to get far more complicated. As part of the ICO’s work, we at Demos were asked to produce a short paper sketching out how key tech trends – especially in the commercial online marketing and advertising world, where these techniques are often first developed – might change political campaigning in future (here it is).
First: increasing customer segmentation will allow audiences to be divided into ever smaller groups on the basis of increasingly granular insight about their demographics, behaviour and attitudes. Lots of companies claim to have huge databases which combine e-mail addresses, social media handles, and all sorts of demographic, geographical, cultural and interest-based data (especially device use meta-data) to create ever more precise profiles. As you’ve no doubt heard several times, we’re also producing more and more data all the time, especially with the growth of internet-enabled devices. The end game is for audiences to be reduced to a target of one – i.e. personalised profiles of each individual, based on an increasingly wide range of data and insight (and improved lookalike modelling to identify potential supporters and voters). In this scenario it wouldn’t be surprising if ‘psychographics’ – the psychological profiling technique offered by Cambridge Analytica – became a more common technique, perhaps using facial recognition technology to help map moods.
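To make the lookalike idea concrete, here is a minimal, entirely hypothetical sketch. Every name and number below is invented for illustration: each person is a small vector of behavioural signals, and prospective voters are ranked by how closely they resemble a ‘seed’ group of known supporters. Real systems use far richer data and proper statistical models, not a bare distance calculation.

```python
from math import sqrt

# All feature names and values are invented for illustration only.
FEATURES = ["age_band", "urban_score", "news_engagement"]

seed_supporters = [
    {"age_band": 0.7, "urban_score": 0.9, "news_engagement": 0.8},
    {"age_band": 0.6, "urban_score": 0.8, "news_engagement": 0.9},
]
prospects = {
    "voter_a": {"age_band": 0.65, "urban_score": 0.85, "news_engagement": 0.85},
    "voter_b": {"age_band": 0.10, "urban_score": 0.20, "news_engagement": 0.10},
}

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return sqrt(sum((u[f] - v[f]) ** 2 for f in FEATURES))

def lookalike_score(person):
    """Higher score = closer, on average, to the seed supporters."""
    return -sum(distance(person, s) for s in seed_supporters) / len(seed_supporters)

ranked = sorted(prospects, key=lambda p: lookalike_score(prospects[p]), reverse=True)
print(ranked)  # voter_a, whose profile resembles the seeds, ranks first
```

The point is not the arithmetic but the logic: once a party holds a seed list of supporters and a large pool of profiled individuals, ranking ‘people like them’ is mechanical.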
Second: more cross-device targeting will mean increasingly sophisticated ways – both probabilistic and deterministic – to gain a ‘user-centric’ view of a person, and target them across devices. In other words, you’re followed around across multiple devices by politicians. They’ll come at you through video, addressable TV spots and desktop display ads. (Videos are likely to be increasingly important here, for reasons we explain in the report.) And one day, perhaps even through your smart home devices (your fridge, coffee machine, baby monitor, etc).
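A hedged sketch of the deterministic/probabilistic distinction, using invented device records: deterministic matching links devices that share an exact identifier (say, a hashed login), while probabilistic matching merely infers a link from overlapping signals such as a shared home IP address. Real identity graphs combine many more signals and trained models.

```python
# Invented device records for illustration; real systems use far more signals.
devices = [
    {"id": "phone_1",  "login_hash": "abc123", "home_ip": "10.0.0.5"},
    {"id": "laptop_1", "login_hash": "abc123", "home_ip": "10.0.0.5"},
    {"id": "tablet_9", "login_hash": None,     "home_ip": "10.0.0.5"},
]

def pairs(devs):
    """All unordered pairs of device records."""
    return [(a, b) for i, a in enumerate(devs) for b in devs[i + 1:]]

def deterministic_matches(devs):
    """Certain links: both devices carry the same exact identifier."""
    return [(a["id"], b["id"]) for a, b in pairs(devs)
            if a["login_hash"] and a["login_hash"] == b["login_hash"]]

def probabilistic_matches(devs):
    """Guessed links: a shared home IP hints at the same person or household."""
    return [(a["id"], b["id"]) for a, b in pairs(devs)
            if a["home_ip"] == b["home_ip"]
            and not (a["login_hash"] and a["login_hash"] == b["login_hash"])]

print(deterministic_matches(devices))  # [('phone_1', 'laptop_1')]
print(probabilistic_matches(devices))  # tablet_9 is linked only by inference
```

The tablet here has no login, so it can only ever be attached to the person by inference – exactly the kind of guessed link that makes cross-device profiles hard to audit.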
Third: it is possible that AI will prove better than human strategists at working out exactly who should be targeted, when, and with what content, in order to maximise persuasive potential. In other words – more automation of the targeting and messaging of political adverts. One could imagine how an AI-assisted service would be capable of pulling together vast amounts of data from across different sources, and identifying relationships likely to remain invisible to human analysts. That would of course be used to monitor and improve the performance of political campaigns, through intense A/B testing and automatic iteration. This could also be used in the creation of messages. In perhaps the most obvious use case, natural language generation tools could be used alongside algorithmic targeting in order to automatically generate content for unique users based on insight about their interests and concerns. Such campaigns could combine the interactive element of chatbots with personal data to serve adverts that incorporate a back-and-forth interaction, potentially referencing previous interactions or stated concerns with new generated pieces of content.
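The A/B testing and automatic iteration described above can be sketched in a few lines. This is a toy simulation under invented assumptions: two ad variants with hidden click-through rates, a test phase that splits impressions evenly, and an automated step that routes the remaining budget to whichever variant measured better – no human strategist in the loop.

```python
import random

random.seed(42)

# Hidden, invented click-through rates; a real campaign never sees these.
TRUE_CTR = {"variant_a": 0.02, "variant_b": 0.05}

def serve(variant):
    """Simulate one ad impression; True means the user clicked."""
    return random.random() < TRUE_CTR[variant]

# Phase 1: the A/B test -- split a test budget evenly between variants.
TEST_IMPRESSIONS = 2000
clicks = {v: sum(serve(v) for _ in range(TEST_IMPRESSIONS)) for v in TRUE_CTR}

# Phase 2: automatic iteration -- shift the remaining budget to the winner.
winner = max(clicks, key=lambda v: clicks[v] / TEST_IMPRESSIONS)
print({v: clicks[v] / TEST_IMPRESSIONS for v in clicks})
print(f"remaining impressions go to: {winner}")
```

A production system would run this loop continuously, across many variants and audience segments at once, which is what makes the scale of optimisation so different from a human media planner.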
Taken to its logical conclusion, this combination of trends could lead to a stream of unique, personalised auto-generated messages targeted at each voter, through multiple devices, constantly updated based on A/B testing. It might sound like sci-fi, but even just a decade ago the idea of programmatic adverts, psychographic profiling based on Facebook likes, and micro-targeting sounded equally far-fetched.
While there are certainly some advantages to these techniques – such as people seeing content they actually care about and might engage with – they create potential problems too. There is the risk of inferred data about a user becoming ‘personal data’, even though no-one knows exactly how it was created, including the parties themselves. There is the possibility of automated profiling and targeting of people based on race or sexuality or who knows what else, in a way the parties themselves don’t fully control. There’s the broader problem of messages being selected and targeted on the basis of perceived resonance rather than any wider ideological or political purpose. And all done with great precision and personalisation, making it tricky for regulators to keep on top of it. Recent reporting has focused on allegations of wrongdoing by Vote Leave, but something bigger is going on here. This is why updated regulations, greater oversight and transparency are vital to maintain the integrity of elections and people’s confidence in how the regulator is keeping tabs on it. The ICO report has recommended a refresh of the regulations to bring them up to speed – ideally before the next general election. Given the speed with which these techniques are developing, I suggest we take that up.