D/SRUPTION speaks with Nicholas Oliver, founder of people.io who explains why privacy is key…
It seems that consumers now have a greater awareness of the consequences of data use, but there is still a lack of understanding of the value of that information. The language of data privacy has, until recently, been dictated by companies.
Nicholas Oliver founded people.io in 2015, after realising that data ownership was not yet recognised as a significant issue. Convinced that innovation would be held back by the availability of data rather than of financial capital, he set up his company to change that language and allow consumers to unlock the true potential of their data.
The ‘Visa of data’
“What we’re really creating is the Visa of data,” says Oliver. “Visa is a transaction layer that sits on top of your money. Visa don’t hold your money, they simply enable you to send something from your bank account to a brand. You know exactly what Visa does. That, in the long term, is what people.io will become.”
Once a user signs up and downloads the people.io app, they are shown a consent screen which assures them that any data they create in the app will be owned by them. It is never sold to a third party, and if the user wants to leave, all of their data is hard deleted. Within the app, users answer a series of questions and are rewarded with credits. Through the accumulation of credits, users can unlock value in the form of improved services, hard products or other perks.
Eventually, explains Oliver, you could walk into an electrical store to buy a fridge, and find one with a people.io sticker. You would then know that the fridge will be able to connect with your people.io account to order the food you want based on your data, reducing your effort and improving your quality of life – without actually handing over any information.
An asset like any other?
In light of recent scandals over the use of data by big tech companies, many people are beginning to question the extent and value of data privacy. This has, in part, forced businesses to change the way they handle consumer data, as have relatively new regulations such as GDPR. But how far do people care about owning their data, and what does data ownership really mean?
“It’s a common point of view that if someone gets a better service, they’ll be more willing to share their data – as is the argument that ‘I’ve done nothing wrong, so why am I bothered about people seeing my information?’ But their definition of what is wrong may change over time,” Oliver says. “Something that wasn’t illegal a few years ago will still be on record in a data form, so you can prove that somebody has done it. But, more importantly, what happens when you share photos on Facebook of you drinking, for example? All you’ve done is share a photo, but if Facebook starts doing loans or offering insurance, that passive image can actually cause you to suffer a consequence.”
Oliver explains that thanks to machine learning, data can be deleted but still have a presence. Intelligent algorithms don’t need the source data itself, and regulations have yet to catch up. This is as much a concern for the average person as for an organisation. The answer, it seems, is awareness.
A universal issue
Today, there is a view that so long as your data remains unused, it is yours. But as soon as it comes into contact with a third party, there is a sharing of rights, and it is no longer perceived as your own. So, should data be viewed in the same way as anything else that we own?
“People should consider data as an asset to the extent that it holds value, but also as a part of them. Your data in isolation has no value – it’s only when it is compared to someone else’s that it has value,” says Oliver. “It is an asset, but in a different marketplace. You need to create a multi-marketplace dimension in order to determine its value.”
Oliver goes on to explain that there is no real way to stop data being collected. Consumers have no choice but to accept that in using any product or service, they sacrifice a level of control. However, this isn’t necessarily a bad thing.
“If you’re using Uber, for example, you will probably know that Uber tracks you everywhere you go. You still use it, because it’s cheaper. There’s enough benefit. We’re not going to stand here and say stop using certain things because there’s no point. What we say is, be conscious of the value you get for things.”
Now that data has become a commodity, is it possible to ensure a level of data privacy? Oliver explains that, in a world of data-driven machine learning, the very definition of privacy will change.
“In many ways, the current definition of privacy is to stop people from analysing what you’ve done before to work out who you are today, and what you might do tomorrow. But with machine learning, if you’ve trained a network to detect certain patterns, it doesn’t care what you’ve done before. It only cares about what you do today to determine what happens tomorrow, so the old definition falls by the wayside.
Privacy is no longer about being surveilled – it’s about how you stop the intelligence from having a consequence over your life.”
To hear more from Nicholas Oliver, register for our upcoming Building Business On Behaviours event here.