If you have ever been on the receiving end of an eyeroll or yawn from me when I am asked about overseas data transfers, privacy policies, or data processing agreements, let me explain why.
Because you can’t see the forest of privacy harms if you are focussed on the privacy compliance trees.
To achieve better privacy outcomes for people, privacy needs to be built into product design, not risk-managed via legal ‘busywork’. We should not be relying solely on what tech expert Robin Berjon calls performative compliance, which treats privacy as “a nuisance to be lawyered away”.
Busywork and administrivia
Yet so often, we see tacked-on solutions and legal risk-shifting, instead of privacy being recognised as a core strategic and operational concern, and integral to product and system design.
Here are a few examples of work that keeps privacy professionals busy without necessarily preventing any privacy harms or improving anyone’s privacy outcomes.
First, privacy policies. Sure, organisations may need to have one by law, but we know that almost no-one reads them. If you try to read one, you probably won’t understand it. And if you do read one, and you can understand what it says about what that organisation, website, app or product is going to do with your data, and you don’t like what it says, good luck finding an equivalent product or service with a better privacy posture. Nice in theory but lame in practice, transparency alone fails to deliver for consumers acting individually.
Second, user controls and consent management. As a mechanism for protecting privacy, the notice and consent model is broken. It’s unrealistic and unfair, it doesn’t scale, and it doesn’t deliver. As privacy academic Daniel Solove says, a model of privacy regulation based on individual management via notice, consent, consumer choice and user controls – “more buttons, switches, tick boxes and toggles” – is just more homework for consumers. A recent CPRC report, for example, found that Australians would need to spend an average of 30 minutes daily to fully adjust privacy settings on websites and apps, rather than accept default settings.
Third, data localisation. Promising to protect privacy by keeping data in a particular place is another ‘solution’ (often beloved by politicians) that does not necessarily reduce privacy harms. Data held in Australia is not magically protected. Data held overseas is not necessarily less secure. Data location is a poor proxy for whether the organisation holding the data is effectively regulated, in a way that incentivises robust data security and enables remedies for an individual suffering privacy harm. For example, APP 8 in our Privacy Act keeps lawyers busy figuring out how to move data to overseas companies with robust data security and privacy-preserving practices, but creates no barrier to disclosing personal information to an unregulated small business in Australia.
Finally, negotiating about data processor versus data controller distinctions in contracts. In our submission on the Privacy Act review, we argued that the introduction of a legal distinction between data controllers and data processors will have significant administrative impacts on regulated entities, while providing little privacy benefit to individuals. It’s more work for lawyers, but no actual privacy protection. (And the compliance burden for understanding and managing that distinction will fall disproportionately on less well-resourced entities, such as non-profits juggling multiple funding contracts.)
Expend more energy on Privacy by Design
Speaking at the CyberCX / Tech Council of Australia Privacy by Design Awards earlier this year, Australian Privacy Commissioner Carly Kind argued that “organisations need to prove they have a social licence, and a significant component of that is about considering and mitigating their role in collective and societal harms”.
So instead of burying privacy pros in legal busywork, regulated entities should be throwing themselves deep into building an organisational culture of privacy by design. Look above and beyond legal compliance, and focus on privacy outcomes. Or in the words of Commissioner Kind, “Don’t be the guys who are just preoccupied with whether you can, think first about whether you should”.
Now more than ever
Large scale and impactful data breaches like Optus, Medibank and Latitude have illustrated why privacy design strategies such as data minimisation are critical to preventing privacy harms. But to achieve collection minimisation, you need to build serious, privacy-protective thinking in from the very start of every project. It’s not a solution that can be tacked on at the end, negotiated into a contract, or promised via a privacy policy.
Designing-in privacy is not only useful for minimising the risk of data breaches. Implementing privacy design strategies can also help meet the expected new ‘fair and reasonable’ requirement.
Commissioner Kind describes this proposed reform as “a fundamental shift in approach”: a proactive obligation on organisations to “take into account the risk of unjustified adverse impact or harm” from their data handling practices. Doing so will require organisations to think about “how new products and offerings can embody fairness and reasonableness right from the start”.
It takes a village … and time
To achieve Privacy by Design, the deep thinking about privacy needs to move beyond the privacy team, to all teams. It also needs to stretch over time, to cover every stage of project management and the product lifecycle, instead of being a box for the compliance folk to tick the week before launch.
Speaking of box-ticking exercises: Privacy Impact Assessment (PIA) is a terrific methodology, which will be key to testing whether you have met the ‘fair and reasonable’ test … so long as it is done well. Despite the Privacy Act’s definition of a PIA making clear that it is about measuring and mitigating “the impact that the activity or function might have on the privacy of individuals”, we often see PIAs conducted as if they are simply a compliance check against statutory privacy principles. They test that the organisation conducting an activity will comply with the law, without ever asking what impact the activity will have on individuals. Here are our tips for doing PIAs better.
In addition to making sure you have an effective PIA Framework (BTW we have tips about PIA Frameworks too), Kind encourages organisations to take “a holistic approach … breaking down silos at the structural, operational and leadership levels to advance privacy by design and good privacy outcomes”, until privacy has been “mainstreamed from the board room to the lunchroom”.
Privacy by Design is not a tickbox exercise the week before you go live. It’s not a pretty website promising you take privacy seriously. It’s not tweaking privacy policies, finessing data processing agreements, or building user self-management tools.
Privacy by Design is hard work, but it is work that pays off. Privacy pros need to take time out from dealing with the compliance trees, in order to tackle the forest.
Want to learn more? Join our September small group workshop on PIAs and Privacy by Design, access our webinar on Eight Privacy Design Strategies, or contact our consulting team if you need a PIA or privacy by design advice.
Photograph © Anna Johnston