WHY THIS MATTERS IN BRIEF
The future of personalisation is intertwined with the future of privacy, and as we head towards an Algorithmic Society I ask if it’s time to take back control of our data.
Today it can be argued that customer experience is more important than ever. As market incumbents increasingly find themselves, and their value chains, under ceaseless attack from hordes of new digital startups from around the world, it has become a battleground. So it was with great delight that I was asked to present the “Future of Customer Experience,” and then appear on a subsequent panel, at Marketforce’s 20:20 Customer Experience Summit in London.
As many of my readers will know, customer experience, and more specifically privacy, lest we forget the two are related, is a subject that’s close to my heart. As multinationals look for new ways to personalise their adverts and services, there’s inevitably only one way they can do it, and that’s by understanding all there is to know about us. That means only one thing – they increasingly track and analyse our digital footprints and shadows in order to create highly detailed individual profiles, which are then rolled up into demographic groups, and so on.
During the summit we heard from a wealth of large, influential companies, all keen for our data, from Argos and Barclays to Channel 4 and beyond. As expected, the picture their executives painted of data collection, under the auspices of personalisation, was a rosy one: the collection and analysis of personal information in pursuit of better tailored services and individual consumer experiences. In the main it’s fair to say everyone in the audience, albeit sometimes with a begrudging shrug, accepted the pros of having their online, and in some cases offline, activities tracked and analysed, but it was evident that there’s a sliding scale of comfort with the process.
Let’s say, for example, you’re using an online entertainment service such as BBC iPlayer, Channel 4 On Demand, Netflix or YouTube to watch television programmes. We all recognise that those services employ increasingly complex algorithms to analyse our online behaviours, in their case to create personalised playlists, and I think it’s fair to say that most of us are okay with that. We still don’t like being “stalked,” as some people put it, but it’s part and parcel of the service we receive, and the worst that can happen in this instance is that we’re shown a list of programmes we don’t like and we flick to the next playlist. It’s a relatively benign use of our personal data.
Now let’s flip to another example. You’re applying for an online credit card, loan, or mortgage. The same process applies: the companies you apply to trawl and analyse your digital footprints and shadows, over 450 data points in all, and in some cases those of your friends and relatives as well, a process called “n+1” analysis that few people realise is a growing phenomenon, and their algorithms come to a decision. Congratulations, you’re a home owner with a car and a credit card.
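To make the mechanics concrete, here is a minimal, purely illustrative sketch of how such a scoring pipeline might blend an applicant’s own data points with features drawn from their network, the “n+1” idea above. Every feature name, weight, threshold and blend factor here is hypothetical; real lenders disclose none of them, which is precisely the point.

```python
# Purely illustrative sketch of an opaque credit-scoring pipeline.
# All feature names, weights, and thresholds are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Profile:
    features: dict                                   # e.g. {"income": 42000}
    connections: list = field(default_factory=list)  # "n+1": the network's profiles

# Hypothetical weights the lender never discloses to the applicant.
WEIGHTS = {"income": 0.00002, "missed_payments": -0.4, "account_age_years": 0.1}

def score(profile: Profile) -> float:
    """Weighted sum of the applicant's own features, nudged by their network."""
    own = sum(WEIGHTS.get(k, 0.0) * v for k, v in profile.features.items())
    # "n+1" analysis: the applicant is also judged by who they know.
    if profile.connections:
        network = sum(score(p) for p in profile.connections) / len(profile.connections)
        own += 0.25 * network   # hypothetical blend factor
    return own

def decide(profile: Profile, threshold: float = 0.5) -> str:
    # The applicant only ever sees this one-word answer, never the factors.
    return "approved" if score(profile) >= threshold else "declined"
```

Note that a friend’s strong profile can pull a borderline applicant up, and a weak one can drag them down, without either party ever being told so.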
These are both good examples of how we’re increasingly reliant on increasingly complicated and opaque algorithms to profile us, and to model and categorise our behaviours into personality types that, as in some game, unlock, or give us access to, particular products and services. Both are examples of what is increasingly being referred to as the “Algorithmic Society,” something that can be a good or a bad thing depending on your experiences and your point of view.
For example, when everything goes your way, and you’re given a playlist you love or the mortgage you needed, the Algorithmic Society is fine; maybe not great, but fine. However, as the amount of information about us all swells, the majority of it uncontrolled, unfiltered and some of it plain inaccurate, and as more data gets swept into these algorithmic nets, we can suddenly find that it’s not as benign as we think, and our lives can quickly take a turn, sometimes for the worse.
Let’s say, for example, you’d been turned down for that credit card, loan or mortgage. On the one hand you’re going to feel disappointed, but on the other hand, like millions of other people, you’re also going to want to know why. Looking into the recourse available to you, you decide to pick up the phone to the company’s customer services department and ask, but all you get is the standard company line: “The system takes a multitude of factors into account and in this case, we’re sorry, we couldn’t approve you.”
Does that sound familiar?
And what are these mysterious “factors,” and what information did the company’s opaque algorithm take into account to come to its “wholly unfair” decision? If you can’t find out the reason for the denial, it’s unlikely you can fix it, and it becomes a circular problem that, if you’re unlucky, might start haunting you wherever you go. The result? You feel deflated, “beaten” by “the system.”
The machines say “No.”
In this particular case, though, there could be a multitude of reasons why your application was declined. For example, you might have a “blip” in your credit report that you didn’t know about, or the algorithm might have trawled your Facebook wall and seen you burning cash, or your identity might have been cloned and used for nefarious purposes on the DarkNet.
The fact is, because none of these algorithms are transparent, a problem that’s increasingly being called out by customers and regulators alike, and because very few people, even their designers, can describe how these “algorithmic black boxes” work, no one can tell you why the machines said “No.”
The Algorithmic Society that used to be your friend is now your frenemy, or worse, and, because you don’t have a way to identify, control and rein in your digital footprints and shadows, you have no way to prevent the same situation from happening again and again.
As I approached the tail end of my presentation I asked the audience how many were comfortable with the way companies track them and analyse their behaviours, and the answer was unanimous: no one was happy with it. It’s the same story wherever I go and whichever country I’m in. Furthermore, everyone felt powerless to stop it, or even influence it; perhaps personal privacy is the looming crisis of our time, one that will one day prompt companies to sell it as a service. And as companies increasingly tie together our online and offline activities and identities, we have to ask the question: will we ever be in control of our own data?
Well, surprisingly, the answer is yes, we could be, but as ever it’s down to us, individually and collectively, to take back control.
Today, in part thanks to the new power of blockchain, a distributed ledger technology that’s transforming how the world works, we’re starting to see the rise of “Sovereign ID” platforms that allow us to do precisely that – capture and control our own information. Furthermore, and as if that wasn’t good enough, these same platforms create transparent, real time audit trails that tell us how, when and by whom our information is being used, and, if we’re unhappy with what we see, well, we can just turn that provider “off.”
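The core idea behind these platforms, an append-only, tamper-evident record of who may access which of your data, and the ability to revoke a provider at will, can be sketched in a few lines. This is a toy illustration, not any real Sovereign ID product’s API; all class, method and field names are hypothetical, and a real distributed-ledger implementation is far more involved.

```python
# Illustrative sketch of a "Sovereign ID" style consent ledger: an
# append-only, hash-chained audit trail recording who may access which
# personal data. All names and fields are hypothetical.

import hashlib
import json
import time

class ConsentLedger:
    def __init__(self):
        self.entries = []  # append-only chain of consent events

    def _append(self, event: dict) -> None:
        # Each entry commits to the previous one, making tampering evident.
        event["prev"] = self.entries[-1]["hash"] if self.entries else "0" * 64
        event["hash"] = hashlib.sha256(
            json.dumps(event, sort_keys=True).encode()).hexdigest()
        self.entries.append(event)

    def grant(self, provider: str, data_field: str) -> None:
        self._append({"op": "grant", "provider": provider,
                      "field": data_field, "ts": time.time()})

    def revoke(self, provider: str) -> None:
        # "Turning a provider off": one event withdraws all their access.
        self._append({"op": "revoke", "provider": provider, "ts": time.time()})

    def allowed(self, provider: str, data_field: str) -> bool:
        # Replay the audit trail to compute the provider's current rights.
        ok = False
        for e in self.entries:
            if e["provider"] != provider:
                continue
            if e["op"] == "grant" and e["field"] == data_field:
                ok = True
            elif e["op"] == "revoke":
                ok = False
        return ok
```

Because access rights are derived by replaying the trail rather than stored as mutable flags, the same log that answers “can this provider see my data?” also answers “who has been using it, and since when?”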
Imagine the power you’d feel if you had control over your own data, and now imagine going one step further and being able to monetise it – how the tables could turn…