While ideas like a right to erasure or ‘tell the banks about victims’ might sound great in theory, they won’t deliver actual improvements.
The Optus data breach has elevated discussion of the need to reform the Privacy Act to mainstream political news. While this is gratifying and exciting, I am nervous about law reform becoming a political football. The more politicians and political pundits get involved, the higher the risk that we will suffer knee-jerk legislative responses which sideline privacy expertise.
We need well-considered, useful reforms which lead to genuine privacy protections, not privacy theatre.
Privacy advocates and industry experts were right about the dangers of legislation creating honeypots for hackers by requiring telcos to collect and store identity data. So those with privacy expertise should be listened to now, instead of the political classes blindly coming up with ‘we have to do something’ ideas masquerading as solutions.
The ideas which won’t help
Telling banks about data breaches
The first link between the Optus data breach and Privacy Act reform was made when the Government suggested that it was the fault of the Privacy Act that Optus couldn’t immediately tell the banks exactly whose data had been compromised, so that the banks could protect those customers from harms arising from identity fraud.
There are plenty of reasons to reform the Privacy Act, but allowing – or even compelling – more data sharing won’t help.
First, forget the idea that we are only talking about a few big banks. Any organisation which supplies finance, housing or goods on credit, from a bank to buy-now-pay-later apps to real estate agents to rental car companies, routinely asks for evidence of identity before providing credit. They all rely on the integrity of our key evidence of identity documents, and in turn we, as consumers and potential victims, rely on their business processes and data security.
So while there is an existing system for sharing data with the bigger banks, smaller banks and financial institutions are not members of the Australian Financial Crimes Exchange, which is why the government is reportedly considering requiring companies suffering data breaches to notify financial institutions directly.
Second, think through the practicalities of a company like Optus notifying all financial institutions about all their affected customers.
Option 1 is that Optus hands over a list of all the 9.8 million affected customers to every credit providing organisation and lets them figure out what to do with that data. Now you’ve got tonnes more personal information swilling about multiple organisations, including some whose data security may be no better than Optus’s was. This makes the honeypot problem so much worse, and saddles those organisations with a bigger cybersecurity risk profile to manage.
Option 2 would involve sophisticated multi-party computational data-matching, with some degree of privacy protection built in, so that Westpac learns only about the people who are customers of both Optus and Westpac, ANZ only about the people who are customers of both Optus and ANZ, and so on, all the way down to the smaller financial sector players. By the time that machinery is sorted out, such that the banks (and all the rest) know which Optus customers they need to ‘protect’, the organised criminals will have already had a field day.
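To make the complexity concrete, here is a minimal sketch of the kind of hashed-token matching Option 2 would require. Everything in it is illustrative: the customer lists, the shared key and the email-style identifiers are all invented, and a real scheme would need a proper private set intersection protocol rather than a single shared HMAC key.

```python
import hashlib
import hmac


def pseudonymise(identifiers, key):
    """Replace each identifier with a keyed hash (HMAC-SHA256).

    Both parties must share `key` out of band; they then compare
    tokens rather than raw customer details.
    """
    return {
        hmac.new(key, ident.encode(), hashlib.sha256).hexdigest(): ident
        for ident in identifiers
    }


# Hypothetical customer lists (illustrative only).
optus_customers = {"alice@example.com", "bob@example.com", "carol@example.com"}
bank_customers = {"bob@example.com", "dave@example.com"}

shared_key = b"agreed-out-of-band"  # assumption: securely exchanged first

optus_tokens = pseudonymise(optus_customers, shared_key)
bank_tokens = pseudonymise(bank_customers, shared_key)

# The bank learns only which of ITS OWN customers were in the breach.
overlap = {bank_tokens[t] for t in bank_tokens.keys() & optus_tokens.keys()}
print(overlap)  # → {'bob@example.com'}
```

Even this toy version assumes the two parties can first agree on a secret key and a canonical identifier format, and that negotiation, repeated for every financial institution large and small, is exactly where the time goes.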
Third, since the Optus data breach affects such a huge chunk of the population, and because it is by no means the first data breach, and undoubtedly will not be the last, it is not only Optus customers who need protecting from the financial harms which can arise from identity fraud. Just as universal infection control was introduced in healthcare instead of singling out patients with HIV, all credit providers should be practising good evidence of identity hygiene for all customers all the time.
So bulk sharing of customer details after a data breach may not be the best solution, in terms of the kinds of Privacy Act reforms we need.
Let’s move on to some other ideas for privacy protections which are shiny distractions rather than practical solutions.
The right to erasure
The right to erasure sounds good in theory, but in practice who would have the time or energy to exercise such a right?
This is the only point on which I agree with the submission Optus made to the Privacy Act review: the compliance costs for organisations of a right to erasure will outweigh the benefit. (Mind you, by giving us Australia’s worst data breach to date, Optus also just undermined the key reasoning behind their argument, which was that “There is insufficient evidence of a problem which would justify such costs”. Whoops.)
There is a problem here, but I don’t believe that an individual-prompted, one-customer-at-a-time, reactive erasure mechanism is the solution.
In particular, why should the onus be on us? Customers shouldn’t have to do the leg work of asking firms to delete their data when it is no longer needed, especially when the law already says that firms must delete their data when it is no longer needed. It’s right there in APP 11.2.
However APP 11.2 only kicks in when an organisation “no longer needs the information for any purpose for which the information may be used or disclosed” (under APPs 6-9). So an organisation might collect personal information for one purpose which is fulfilled in the short-term, but then decide to keep it even longer to use for a secondary purpose. So long as that secondary purpose is lawful – such as marketing, or auditing, or ‘in case a customer complains in the future’ – APP 11.2 says they can continue to store and use it.
What we need is a tougher standard in relation to data retention, which sets a retention period with reference to the original purpose of collection, and then compels proactive data deletion for all customers, instead of a reactive ‘right’ for the very few who have the energy to exercise it.
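A proactive retention rule of the kind described above is straightforward to express in code. This is a hypothetical sketch: the purposes, retention periods and record format are all assumptions of mine, not anything prescribed by APP 11.2.

```python
from datetime import datetime, timedelta

# Hypothetical retention periods keyed to the ORIGINAL purpose of
# collection, not to any later secondary purpose (assumed values).
RETENTION_BY_PURPOSE = {
    "identity_verification": timedelta(days=90),
    "billing": timedelta(days=365 * 2),
}


def records_due_for_deletion(records, now):
    """Return records whose retention period, tied to the purpose
    they were originally collected for, has expired."""
    return [
        r for r in records
        if now - r["collected_at"] > RETENTION_BY_PURPOSE[r["purpose"]]
    ]


records = [
    {"id": 1, "purpose": "identity_verification",
     "collected_at": datetime(2022, 1, 1)},
    {"id": 2, "purpose": "billing",
     "collected_at": datetime(2022, 9, 1)},
]

expired = records_due_for_deletion(records, now=datetime(2022, 10, 1))
print([r["id"] for r in expired])  # → [1]
```

The point of the design is that deletion runs for every customer on a schedule, with no individual ever needing to lodge a request.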
More notice, consent or choice
Another common suggestion to solve privacy problems is along the lines of ‘just give people more information then get their consent’. Even the ACCC, in its Digital Platforms inquiry, originally recommended beefing up the role of notice and consent as a mechanism to authorise personal information handling, rather than restricting collection, use and disclosure in the first place. This is very much the American model of privacy protection, which is built on the assumption that ‘the market’, full of perfectly informed and rational consumers, will sort things out all by itself.
In this vein let me quote this gem from the submission Optus made to the Privacy Act review, arguing against the need for any strengthening of privacy protections because notice and consent is enough: “In effectively competitive markets, where consumers are informed about the use of their data, consumers are able to choose the company that best matches their preferences. … where companies do not do so, or breach the Privacy Act, they will lose business and customers… It is this market process that provides an effective limit on the behaviour of businesses”.
While this is a lovely economic theory about how competitive markets work, those of us who live in the real world know that ‘competition’ is useless at protecting human rights.
While the recent data breach may well “lose business and customers” for Optus if there is a mass exodus of customers to their rivals, in reality we as consumers have no way of knowing whether the other telcos are any better at collection minimisation, data deletion or data security. And certainly, competitive pressure did not provide sufficient incentive for Optus to prevent this data breach.
Plus, as individuals we often don’t have choice about whether or not our personal information is collected. Sometimes a law requires us to provide some information, in order to get a mobile phone, obtain healthcare, apply for a pension, open a bank account, enter the country, lodge our tax, and so on. And sometimes a business practice is so ubiquitous, and our ability as an individual to say ‘no’ to a platform is so constrained, that our ‘choice’ is only theoretical.
As a mechanism for protecting privacy, the notice and consent model is broken. It’s unrealistic and unfair, it doesn’t scale, and it doesn’t deliver. As privacy academic Daniel Solove says, a model of privacy regulation based on individual management via notice, consent, consumer choice and user controls – “more buttons, switches, tick boxes and toggles” – is just more homework for us.
Data localisation
Promising to protect privacy by keeping data in Australia is another shiny ‘look at me!’ idea that is next to useless in reality.
While it must be tempting as a politician to pander to the mildly xenophobic tendencies of the voting public, data held in Australia is not magically protected. Data held overseas is not necessarily less secure. So data localisation is not something you often see privacy advocates asking for.
Yet one of the ‘privacy protections’ purportedly ‘won’ by Labor when in Opposition, as the Data Availability and Transparency Bill was being debated in Parliament earlier this year, is a prohibition under s.16A(2) on “storing or accessing, or providing access to” the shared data, “outside Australia”. Unlike under the Privacy Act, there is no exception for organisations or jurisdictions offering equivalent privacy protection, or for data storage not involving a disclosure to a third party.
The end result? Use of cloud storage facilities, which often hold data offshore yet offer stronger security than keeping data on a USB stick, will be prohibited. Plus, collaboration with international researchers or institutions will not be possible. What probably sounded great in theory as a privacy protection does not match what is needed in reality.
Do you know any privacy or cybersecurity experts who think this is a good idea? No, me neither.
Prohibiting re-identification
Prohibiting re-identification is another shiny-yet-terrible idea which has unfortunately now made its way into the DAT Act, after being repeatedly rejected in other draft laws since it was first mooted in 2016.
And the version in the DAT Act is even worse than the version first proposed in 2016 after security researchers discovered that the government’s release of MBS data was re-identifiable. At least in the original version, the offence would only have applied to data released publicly, by agencies, in the belief that it was not re-identifiable. In the DAT Act, the provision is blanket: under s.16A(3), if data that has been de-identified is shared between two organisations, the recipient will be prohibited from “taking any action that may have the result that the data ceases to be de-identified”.
The effect is that not only will security testing of re-identification risk be prohibited, but so too will the use of de-identification techniques such as encoding or pseudonymisation, which are routinely used as data security measures to protect data in transit from A to B, prior to data linkage.
If B is prohibited from re-identifying the data upon receipt, some of the use cases the DAT Act is intended to support, such as the provision of emergency support services to people affected by natural disasters (who need to be identifiable in order to receive the support), will not be possible if any de-identification technique had previously been used to protect the data in transit. Since A and B will only be able to share data under the DAT Act if they don’t de-identify data first, the result is a perverse disincentive to use de-identification at all. This could result in a lower standard of privacy protection, instead of an improvement.
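To see why the prohibition bites, consider a toy sketch of pseudonymisation in transit. The names, key and record format are invented; the point is that the final step, mapping tokens back to people so support can actually be delivered, is precisely the kind of “action that may have the result that the data ceases to be de-identified” which s.16A(3) prohibits.

```python
import hashlib
import hmac

KEY = b"shared-linkage-key"  # assumption: held by the parties to the linkage


def tokenise(ident: str) -> str:
    """Pseudonymise an identifier for transit (a de-identification step)."""
    return hmac.new(KEY, ident.encode(), hashlib.sha256).hexdigest()


# Organisation A pseudonymises before sending, as a security measure,
# and supplies the token map so the recipient can re-identify when needed.
people = ["alice", "bob"]
token_map = {tokenise(p): p for p in people}
in_transit = [{"token": tokenise(p), "postcode": "2480"} for p in people]

# Organisation B re-identifies on receipt so it can deliver support.
# Under s.16A(3) of the DAT Act, this is the prohibited step.
received = [{**rec, "name": token_map[rec["token"]]} for rec in in_transit]
print([r["name"] for r in received])  # → ['alice', 'bob']
```

Because the last line is unlawful, the rational response is to skip the pseudonymisation step entirely and send identified data, which is the perverse disincentive described above.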
Again, this was not a ‘privacy protection’ that the privacy community lobbied for, in our opposition to the DAT Act. It was added late to the Bill, without consultation. Only after the Bill had passed could we see the amendments, by which point it was too late to point out their flaws.
It was raised again as a proposal in the context of the Privacy Act review, but fingers crossed it won’t feature in the final version of the recommendations from the Attorney General’s Department.
8 Privacy Act reforms which would help
OK, now that I have hopefully convinced the government to listen to privacy experts before getting out the legislative drafting pen, here are my top eight suggestions for privacy law reforms which will help protect Australians’ privacy.
I will keep this short, because we have already written extensively about most of these ideas in our submissions to the Privacy Act review.
- If anything, the Optus data breach confirms the need to look after our government-issued evidence of identity documents and unique identifiers. The Identifiers principle (APP 9) should prohibit the collection and storage of government related identifiers (passport numbers, driver licence numbers etc), unless the collection or retention is required by law. Currently, APP 9 only limits adoption, use or disclosure. Clearly, we need a greater incentive for organisations to practise collection minimisation and data deletion.
- Speaking of data deletion, as mentioned above, the Data Retention principle (APP 11.2) should set data retention periods with reference to the original purpose of collection, not ‘any’ subsequent purpose for which the data might be used.
- Introduce a direct right of action which enables people complaining of a breach of one or more APPs to lodge a case in an accessible, ‘no cost’ tribunal, without needing the OAIC to be involved.
- Significantly increase the maximum fines associated with breaches of the APPs, and enable the OAIC to levy them without the Federal Court being involved. Putting penalties for privacy law breaches on par with competition law breaches would certainly focus the minds of senior executives.
- Impose stricter limits on collection, use and disclosure. For example, the ‘related secondary purpose’ test for use and disclosure needs tightening, and as the Attorney General himself has suggested, the collection rules also need fixing to crack down on businesses commoditising personal information.
- Build in an overarching ‘fair and reasonable’ requirement for all forms of collection, use and disclosure, which applies whether or not the individual’s ‘consent’ has been obtained for the use case.
- Clarify the definition of personal information to unambiguously include inferred and observed data, as well as unique identifiers and other techniques used to distinguish and act upon individuals.
- Bring small businesses into the fold. Turnover is not an indicator of the level of privacy risk posed by a business. But small businesses will need a helping hand from the regulator in order to get up to speed.
Which brings me to my next point.
Improving privacy protections for Australians takes more than legislative reform. The Australian Government also needs to fund the OAIC properly. The privacy regulator has been shamefully under-funded and hamstrung for decades. If the Australian Government is serious about dragging the Privacy Act into shape, the regulator needs teeth. And to have teeth, regulators need big budgets. Big budgets are also needed so the regulator can be a guide dog for all the organisations wanting to understand and apply the law, not just a watchdog for the recalcitrant.
And then what?
Finally, what happens after the law is reformed?
Passing legislation to reform the Privacy Act is the easy part. What happens next is up to companies.
Preventing the next big data breach is not just about beefing up your cybersecurity team. As the Chair of ASIC noted in the wake of the Optus data breach, the management of data-related risks can’t be left to the IT department alone.
The best way to lower your organisational data security risk profile is to collect and store less personal information in the first place. Companies need to take a privacy-first approach to decisions about what personal information to collect (including whether, why, when and how), as well as limiting data storage and secondary use.
Privacy and data security responsibility needs to work top-down from the Board of Directors and the C-suite, as well as bottom-up from product and system development teams. Privacy officers should be training teams to understand and implement ‘Privacy by Design’ as something more than just a slogan, and to take a more disciplined approach to collection minimisation in order to reflect consumer wishes, achieve privacy compliance, and lower the cyber security risk profile for their companies.
Then we can move beyond privacy theatre, to real privacy protections.