
Flipping data privacy on its head
We have all been creeped out at some point by an ad for a holiday, a car or something else that we've not even discussed outside the privacy of our own home. We've all felt like advertisers know more about us than they should, and worried about why search engines are able to anticipate exactly what it is we're looking for (while at the same time enjoying the convenience).
Many governments are considering ways of helping us to control what information we have out in the ether of the Internet. Some jurisdictions, such as the European Union, are going further and pushing for transparency over how algorithms use our data (see the proposed Digital Services Act). Others are creating data sharing rights for consumers, such as the Consumer Data Right in Australia and open banking initiatives in the UK and Canada.
Customer expectations are also starting to influence the market, with more platforms including explanations on their recommendation engines with headings such as “because you searched for”. Similarly, Google, with their Chrome browser dominance, is looking to move beyond cookies and put users into cohorts to serve up ads rather than allow advertisers to directly target each of us individually. At the same time, the privacy wars have led Apple to move more control over the collection of data by mobile apps into the hands of users.
These initiatives are important, but they fail to recognise that data, and its use, are now more complicated than ever before in the world of AI. Not only is our data collected and stored as part of profiling; it is also used to train algorithms. While we can remove data that has been stored about us, the imprint of our data can live on in ways that are hard to reverse or even detect.
Some argue that such privacy initiatives are attempting to hold back an inevitable trend in business and society. An alternative approach is to work with, rather than against, the industry. The technology platforms, in all their forms from search and social media to retail and delivery, have three goals: 1) to provide a service to their users; 2) to earn profitable revenue from their service; and 3) to create some sort of barrier to entry for competitors. Any attempt to regulate that doesn’t recognise all three of these goals will fail, act as an impediment to new competitors or both.
1. Service to their users
A few years ago, I compared technology to alcohol when I wrote about the paradox of the beneficial and detrimental effects of the integration of the technology platforms into every part of our lives (see If by whiskey). There is a real risk that over-regulation, while well intentioned, could meet the same fate as attempts at prohibition in the early twentieth century.
2. Earn profitable revenue
To enjoy the benefits of the digital services we’ve come to rely on, we need these platforms to be profitable in every service that they offer. When any one service requires cross-subsidisation, there is a lack of competition and the temptation to extend the use of data beyond its original purpose can be commercially irresistible.
3. Barrier to entry for competitors
Ongoing improvements in everything from retail to digital health require investment. It's in all our interests that companies can monetise and protect their investments through patents, secrecy or a head start in the market. This needs to be balanced against the need for a vibrant and competitive digital economy.
The early utopian dreams of a completely free Internet have proven no truer than in any other part of society. We must find the balance between privacy, freedom and the interests of our communities, economy and each workplace. I was privileged to talk about some of the issues in these last two areas recently as part of The CIO Australia Show podcast with host David Binning and fellow guests Nicki Doble (the CIO of the Cover-More Group) and Jim Stanford (economist and Director at the Centre for Future Work).
The key to finding the balance is transparency. To achieve the third goal, a barrier to entry for competitors, most of the algorithms we touch every day are opaque. We don’t know how the data that’s been collected about us is turned into the information that we’re fed.
KFC long ago established mystery as a barrier to entry with "The Colonel's 11 secret herbs and spices". However, the level of secrecy around today's algorithms would be akin to KFC hiding whether we're eating chicken or fish! Arguably, the inconsistency of the data served up based on personal behaviour is just as much of a problem. The social impact of news feeds and search results being tailored to what people want to see has been well documented.
Perhaps the right answer is for us to be treated sometimes as an individual, sometimes as part of a cohort, and sometimes simply anonymously as part of the crowd. If regulation can give us at least that freedom, then is that any different from the rest of the real world?