Legislation on data collection and privacy needs to do more than protect individual privacy – it must also guard against abuses of large amounts of profiled data, which is where the real value is.
The proposed Australian Consumer Privacy Act essentially requires that people’s data cannot be shared without their consent. In IDC’s 2017 US/China Smart Home Consumer Survey, more than 5,000 middle-class, urban respondents were asked what information they would share and under what conditions. The responses were revealing.
Responding to the question about whether they would “share more information about myself (what I am doing, what I like, where I am, etc.) if it resulted in a product or service experience that was better optimized for my personal needs”, most people were happy to share in exchange for immediate gain:
- Over 55% said they would be okay if the outcome was a better product or service
- Similarly, 41% said they would share information if it got them discounts or a price advantage
- Only 30% said they would be concerned about how, or whether, the data was shared.
The problem arises when someone is offered an improved product, service or price advantage in exchange for some data. Unless legislation restricts the data to a single use, data given up once is given up for an extended period. Protecting against future use is the challenge here – a challenge that GDPR and similar legislation are trying to address.
In the same survey, over 40% said they would give up their name, address, photos, and some personal and activity data.
We become concerned about data privacy and security only when the assumption of trust is broken – i.e. when there is a breach. When it serves our personal interest, we give up something that has a low perceived personal value. Legislating to close the door after the horse has bolted may well be an acceptable risk, given the value that can be derived from using that data en masse at the right time.
Legislation can restrict the amount or type of data that is shared, limiting how ‘personal’ the data is. However, the major value of data lies in its ability to describe and predict the behavior of groups – what we collectively care about, how we respond to messages, what we collectively fear and love. Though GDPR and similar legislation may make it harder to tie a sentiment to ‘me’, it will be much harder to legislate against the abuse of large amounts of profiled data.
That’s the challenge for the digital future.
Written by Hugh Ujhazy, associate vice president of IoT & Telecoms at IDC | Originally published on LinkedIn