Much of the work we do here at Salinger Privacy involves Privacy Impact Assessment of new projects. One of the things I love about PIAs is that they’re not just about ticking off legal compliance – they need to consider community or stakeholder expectations about the project as well. But how do you test for community expectations? Most PIAs don’t have the time or budget to commission specific research.
So it is that every few years, I await with bated breath the results of the OAIC’s latest Community Attitudes Survey. It’s like a little goldmine of stats that anyone can use. For instance in this year’s results, released earlier this month, I found this nugget: while 86% of people see secondary use of their personal information as ‘misuse’, that number falls if the purpose of the secondary use is research or policy-making, in which case only 40% remain uncomfortable with the idea. (A quick digression: actually, 40% of nay-sayers seems quite high, don’t you think? This sizeable minority could pose a problem if your project is all about Big Data or data analytics for government policy-making.)
There are also insights into the extent to which Australians trust different service providers to handle their personal information. Health service providers are at the top of the tree, and the social media industry is at the bottom. And in news that should surprise no-one, trust declines with age, with older people more likely to question why they should hand over their personal information at all. In other words, middle-aged crankiness is a thing. (Yeah! Sing it with me, fellow Gen-Xers, we’re not gonna take it!)
But can we trust these stats about, well, trust? How accurate are they? How well do they reflect community attitudes towards privacy issues, when privacy is such a personal value?
Former UN statistician and data journalist Mona Chalabi has said that using opinion polling to predict how people will behave is “about as accurate as using the moon to predict hospital admissions”. She notes that society is very diverse, and so it is hard to get a representative sample of the population for survey-based statistics; people are reluctant to answer their phone to pollsters; and of course, some people lie.
(In fact, lying is one of the topics surveyed. The 2013 OAIC survey results suggested that around 30% of people lie or give misinformation in order to protect their privacy when using websites or smartphone apps, up from 25% when the question was asked in 2007. In 2017, the figure looks like 26% if you count people who said they provide false personal details ‘always, often or sometimes’; but 46% if you add in ‘rarely’. Meanwhile, the Productivity Commission says that ACMA says the figure is 47%. So who – or which figure – to believe?)
Some demographic factors appear to influence an individual’s attitude towards privacy, including age, gender, ethnicity and socio-economic status. But wouldn’t you expect that, averaged out across whole populations, Australian and New Zealand attitudes towards privacy risks would be roughly the same? Compare these stats about what apparently bothers people the most in our two countries, and think about how surveys can give the wrong impression if not analysed carefully.
A survey by the New Zealand Privacy Commissioner in 2016 asked what people found most ‘sensitive’. It found that a large majority of respondents (80%) were sensitive about the content of personal phone conversations or email messages, and a smaller majority of New Zealanders were sensitive about personal earnings (66%), health information (65%), physical location (63%) and websites visited (54%). Respondents were less sensitive about purchasing habits (42%), birth date (39%), and political and religious views (38% and 31% respectively).
So you might conclude that Kiwis are most concerned about protecting the privacy of what they write or say, but also quite concerned about the privacy of their location and what they do online. Yet none of these topics feature in the latest Australian OAIC survey results at all.
Should we conclude that Aussies are happy to be spied on, and have both our physical movements and our online habits tracked? Now that we have mandatory data retention with warrantless law enforcement access to our metadata, have Aussies concluded that everything is tickety-boo, because privacy doesn’t matter anymore? No of course not. The explanation comes not from some deep exploration of cultural differences between Australia and New Zealand, but in the way the survey questions are framed.
Rather than asking about the sensitivity of particular issues, the OAIC 2017 Australian survey asked people to nominate the types of information they are reluctant to provide to businesses and government. Phrased this way, it is not surprising that issues related to surveillance, monitoring or profiling don’t rate at all. Instead, we see financial status, contact information, date of birth, identity documents and health information being nominated most often.
Whichever survey you consider, what the law treats as ‘sensitive information’ and thus worthy of additional legal protection is not necessarily what the public sees as most harmful. For example, geolocation data and web browsing history are not categories protected to the higher standard by law, yet more people are concerned about those types of information being collected or used about them than they are about their ethnicity, sexuality or religion.
Australian Privacy Commissioner Timothy Pilgrim, in his speech launching the 2017 survey results during Privacy Awareness Week, noted that community expectations about how the law works often don’t reflect reality. As an example, the majority of respondents believed that the Australian Privacy Act regulates various types of entities that in fact are exempt, such as media organisations (69% thought they were regulated), political parties (64%), and small businesses (66%). Noting that we need to ‘close the gap’ between the law and the community’s expectations of how the law works, he posed the question: in which direction should we move? In other words, should we make the law reflect community expectations about how our privacy should be protected, or just educate people better about the gaps in the law?
So, we have a gap between community expectations and the law. But how about people’s expectations and their own behaviour: surely this at least is consistent? Ah, no.
There is a disconnect, known as the privacy paradox, between how worried people say they are about their online privacy and the steps they actually take to protect themselves online. There are psychological reasons for this paradox, which it shares with other forms of disconnected decision-making (exhibit A: smoking), but they tend to be ignored in favour of a conclusion that actually, privacy doesn’t really matter.
In other words, the privacy paradox is used to justify myriad privacy-invasive practices on the basis of a claim that usually goes along the lines of ‘oh, no-one really cares about privacy, they all put everything on Facebook these days anyway’. Not only is this sloppy analysis, but it allows the social media giants (who, let’s remember, are at the bottom of the trust scale) to promote their profit-making philosophy that privacy is dead, get over it.
I find this type of thinking has unfortunately coloured the latest Productivity Commission inquiry into data availability and use.
Across both the draft report and the final report, the Productivity Commission has extolled the virtues of making greater use of existing datasets to further research and resolve public policy problems. I did notice a slight shift in tone from draft to final. While the draft had an undercurrent suggesting that nobody but the Privacy Commissioner and a few privacy advocates care about privacy, and that most Australians would be quite willing to share their data if only they could, the final report readily acknowledges that lack of trust, rather than privacy laws per se, is what holds back many data-use projects; and that privacy risks “should not be downplayed or trivialised”.
In the final report, finding 3.1 says: “Individuals are likely to be more willing to allow data about themselves to be used by private and public organisations, provided they understand why and how the data is being used, can see tangible benefits, and have control over who the data is shared with”.
Now just pause for a moment to think about what this is actually saying. These three elements are quite hard to achieve. It sets a really high bar for a project or an organisation to clear if it is going to engender the kind of public trust necessary for data-sharing to occur. In other words, the Productivity Commission has characterised Australian community attitudes towards data-sharing as extraordinarily privacy-protective.
The Productivity Commission also found that “community trust and acceptance will be vital for the implementation of any reforms to Australia’s data infrastructure”. There is much talk of engendering the kind of social licence necessary for public acceptance of data-sharing. But how to get this social licence?
The Productivity Commission has explicitly rejected the option of better enabling privacy rights, whether through minor law reform (such as by fleshing out the access right in APP 12 to include the right to receive one’s personal information in machine-readable form, aka data portability), or through building decision-assistance tools to guide organisations to make better and faster decisions on releasing data under existing privacy rules, as I suggested when I appeared before the Commission.
Instead, the Productivity Commission has recommended the creation of a new set of legal consumer-based access-to-data rights, along with a complex and bureaucratic system of industry-developed data-specification agreements, overseen by the ACCC.
As others have noted, these recommendations might aim to make our “complex data landscape simpler”, but “the desire to simplify in practice only makes the data landscape more complicated”.
And none of these proposed consumer rights actually offer “control over who the data is shared with”. In fact, people having control over who their personal information is shared with runs directly counter to the other recommendations made by the Productivity Commission, which suggests that a new piece of legislation should sweep away all existing privacy and secrecy barriers – even those in State and Territory laws – to promote the sharing of data in the national interest. (To be overseen by a National Data Custodian, helped along by Accredited Release Authorities, who can decide what data gets released to whom.)
I don’t believe that social licence for greater data-sharing – and let’s face it, we are talking here about data-sharing for unrelated secondary purposes, without the subject’s consent, not already authorised under a research exemption – can be built by sweeping away existing privacy and secrecy protections. Even the proposed new National Data Custodian will need to make some kind of case-by-case assessment, based on a mix of legal and ethical review, common sense, knowing the customer base, avoiding the creepy, and maybe a bit of intuition as well. The same goes for your projects too.
Because even if you comply with the law, a backlash from your customers or the wider public can bring your project undone faster than you can say ‘Australia Card’. As the Productivity Commission warns: “It can be difficult for a data holder to know if they have community support for use of data; but they will almost certainly know if they do not”.