Given some of the breathless media coverage about the proposed reforms to the Privacy Act, you could be forgiven for thinking that the government is about to embark on a radical transformation of our privacy laws, to the benefit of consumers and the detriment of business. However, the devil, as they say, is in the details. And when it comes to the Privacy Act review, the details in the final set of proposals are… surprising.
The Final Report from the Attorney-General’s Department’s review of the Privacy Act was published last month, although just how ‘final’ it is remains debatable, given we have until 31 March to make yet another submission. And at 320 pages, with 116 proposals for reform, there is a lot to digest.
Plenty has already been written by way of summarising the proposals, focussing on things like the introduction of a ‘fair and reasonable’ test, mandatory Privacy Impact Assessments for high-risk activities, and a new right to erasure.
Indeed we have our own guide to how the proposed reforms will change the status quo, plus a webinar to explore their implications.
But a number of those reform proposals were very much as expected, having been flagged in the 2021 Discussion Paper.
So for a fresh take on the proposed reforms, today we have teased out for you some of the things we found most surprising in the Final Report.
In particular, we were surprised to find a disturbing amount of accommodation of the data-extractive business practices which Australians don’t want; a doubling-down on the idea that consent can be ‘forced’ via terms of service, contrary to the ACCC’s exposure of digital platform processes which kicked off this whole review in the first place; and the introduction of unnecessary complexities which will confuse businesses and consumers alike, without improving privacy outcomes.
Let’s unpack each of these in turn.
Legislative whack-a-mole
One of the strengths of the Privacy Act is the flexibility offered by its technology neutrality and principles-based nature. According to the Final Report, a common theme in submissions to the review was support for the current design.
So it is surprising to see so many proposals which could undermine the neutrality and flexibility of the Privacy Act, by proposing special rules for particular technologies, sectors or activities.
In order to illustrate why foundational concepts like the definitions of personal information and consent need updating, or why the privacy principles need strengthening, many submissions over the course of the review would have pointed to particular business practices or technologies which show how privacy harms can arise. Examples include facial recognition, location data, marketing to children, and online behavioural tracking.
But instead of taking these issues du jour as illustrative of broad problems requiring tech- and industry-neutral solutions, the Final Report treats them as separate issues to be dealt with one-by-one. A number of practices are called out individually and dealt with by creating bespoke individual rights, prohibitions, or other special rules.
An example is the treatment of the deeply problematic (and clearly unwanted by Australians) practice of online tracking, data-matching and building profiles about individuals: by linking data across different companies, devices and platforms, using identifiers or pseudonyms such as hashed email addresses, individuals can be singled out and targeted even if they cannot be identified.
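To make that mechanism concrete, here is a minimal sketch – in Python, and purely illustrative, with a made-up email address, datasets and field names – of how two unrelated companies can independently derive the same stable pseudonym from an email address and then match their records, without either of them ever learning who the person is:

```python
import hashlib

def pseudonym(email: str) -> str:
    """Derive a stable pseudonymous identifier from an email address.

    Normalising then hashing means two unrelated companies holding the
    same email address will independently derive the same identifier,
    without ever exchanging the address itself.
    """
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical Company A records (e.g. a retailer's loyalty database)
company_a = {pseudonym("jane@example.com"): {"purchases": ["pram", "nappies"]}}

# Hypothetical Company B records (e.g. an AdTech platform's browsing profiles)
company_b = {pseudonym("Jane@Example.com "): {"segments": ["expecting-parent"]}}

# Matching on the shared pseudonym links the two profiles: 'Jane' can now be
# singled out and targeted across both companies, even though neither dataset
# contains her name or any conventional identifier.
for pid, profile in company_a.items():
    if pid in company_b:
        print(pid[:12], {**profile, **company_b[pid]})
```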
Instead of dealing with these problems by clearly bringing individuation within the scope of the definition of personal information, and letting the standard privacy principles regulating collection, use and disclosure deal with what is lawful and what is not in relation to those use cases, the Final Report proposes to tackle these problems by establishing a whole separate set of rules relating to some use cases.
And then ultimately the Final Report only addresses the most visible end point of the corporate surveillance business model, with a proposal to do nothing beyond supporting users’ right to ‘opt out’ of seeing targeted advertising, “consistent with current industry practice” which allows users to ‘opt out’ via their account settings on each separate social media platform.
So the Final Report actually backs in current business models which are built on data extraction and exploitation, saying that the non-reform position on this issue is because they “recognise the potential impact of any reforms on ad-supported platforms”.
Let that sink in for a moment. We have an Attorney-General who says that “for too long, we’ve had companies solely looking at data as an asset that they can use commercially”. At the IAPP ANZ Summit last year, in a pre-recorded address, Mark Dreyfus also said that the power held by digital platforms underscores the need for reforms; that personal data should not be treated as a “corporate asset”; and that information is being used in ways that harm and manipulate, and do not meet community expectations, including online behavioural advertising targeting children and the vulnerable, and the use of facial recognition. He committed to bringing forward reforms to build a strong regulatory framework that is “fit for purpose in the digital age”, including changes to the scope of the types of information covered by the Act.
Yet we have a report from his Department saying that they “recognise the potential impact of any reforms on ad-supported platforms”, which is why they propose to do… very little.
The proposals in the Final Report leave untouched the building or sharing of profiles about individuals, so all the stuff behind the scenes, like building ‘lookalike’ audiences, gets a free pass. And as we have said before, the harms from online tracking, profiling and targeting at an individuated level are not just about seeing ads. Ultimately the effect of the Chapter 20 proposals is to build a complex legislative structure which then allows individual consumers to choose not to see that the horse has already bolted.
The other problem with drafting specific rules for specific use cases is that they are likely to become obsolete by the time the ink is dry on the amendments. (If technology-specific rules had been included when the Act was drafted in 1988, the privacy principles would today be vigorously protecting us from the harms caused by fax numbers, and information stored on floppy disks.)
Too much specificity in identifying particular harms to be legislated against (like ‘seeing’ targeted ads) also encourages the wrong type of innovation, because companies can re-structure their practices to evade capture by provisions which relate only to certain use cases. Tying compliance requirements to specific practices, rather than to the APPs more generally, means that if a company can argue that its business practice falls outside a given definition, that practice no longer attracts any of the protections afforded by the Privacy Act. It would be much better to preserve the principles-based approach of the Act, to ensure that organisations must meet compliance obligations regardless of the purposes for which they handle individuals’ information.
These proposals will doom the Act to be designed for the past, instead of the future. Rather than play legislative ‘whack-a-mole’ with harms after they arise, the only way to ensure the Act remains relevant to emerging risks is to keep it technologically neutral and principles-based. A key component of this will be to recognise ‘individuation’ within the definition of personal information.
One step forward, two steps back
Speaking of the definition of personal information…
It’s been a long and winding road from the Digital Platforms Inquiry (DPI), but it is worth remembering that the review of the Privacy Act was kicked off by the Australian Competition and Consumer Commission (ACCC), with the release in 2019 of their DPI final report.
As part of the Government’s response to the DPI final report, in 2019 the Government supported in-principle a number of specific reforms, including:
- updating the definition of ‘personal information’ to capture technical data and other online identifiers (Recommendation 16(a) from the ACCC), and
- strengthening consent requirements and pro-consumer defaults (Recommendation 16(c)).
So it is disappointing to see the proposed definitions for personal information and consent from the 2021 Discussion Paper get watered down, and even further undermined in some places.
There is merit in some of the proposals to add context to the definition of ‘personal information’, such as by clarifying that inferred data is in scope and that the act of drawing inferences or generating new insights from existing data is a fresh ‘collection’.
However, there has also been a watering-down of the earlier proposal to more clearly explain what is within the scope of the definition.
In 2021 the Discussion Paper recommended that the Act be amended to “(d)efine ‘reasonably identifiable’ to cover circumstances in which an individual could be identified, directly or indirectly”. It also stated that: “The definition would cover circumstances in which an individual is distinguished from others or has a profile associated with a pseudonym or identifier, despite not being named.”
The 2021 proposal was not perfect, but it would have helped clarify the law for genuinely confused organisations and consumers alike. It would also have gone some way towards stamping out industry arguments that practices like facial detection or non-cookie-based targeted advertising do not collect or use ‘personal information’ (and thus do not come within the scope of privacy laws) because no-one can be “reasonably identified” from the data they handle. In particular, it would have helped deal with the disingenuous claims about ‘anonymous’ data made to consumers by media and AdTech companies, especially when compared with what they tell brands about their cross-brand data-matching, online tracking, profiling and ‘addressable’ targeting capabilities.
However the Final Report has scrapped that idea. Instead, Proposal 4.4 simply suggests that “‘(r)easonably identifiable’ should be supported by a non-exhaustive list of circumstances to which APP entities will be expected to have regard in their assessment.”
So while noting without challenge the OAIC’s formulation over many years of what can make someone reasonably identifiable – that an individual is ‘reasonably identifiable’ if the person can be uniquely distinguished from all other people in a group, even if their ‘identity’ is not known – the Final Report does not propose building this test into the wording of the Act itself.
The final version of the proposal shifts too much compliance burden onto organisations to ‘do their own assessment’ of what the definition means in practice. It will create confusion, become quickly outdated, and result in legal loopholes which can be exploited for both existing practices and future developments.
Unless this opportunity is seized to build a precise and robust definition into the Act itself, any lack of clarity will result in an increased compliance burden for organisations, as they struggle to understand the scope of data to which their obligations apply. In some sectors, we would expect to see a minefield of legal arguments as companies race to the bottom to exploit any inconsistencies or ‘wriggle room’ in the Act, for as long as possible.
And speaking of wriggle room…
The Final Report also proposes to make explicit what constitutes a valid consent: that consent must be voluntary, informed, specific, current and “unambiguous”. Consent must also be given by a person with capacity, and it must be as easy to withdraw consent as to provide it. So far, so good.
However, this is a lower standard than the 2021 Discussion Paper’s proposed test of “unambiguous indication through clear action”.
By lowering the standard, this proposal leaves the door open for unexpected secondary uses of personal information built on implied consent.
And in any case, that proposed definition for consent is then contradicted in two other places in the Final Report. One proposal is to introduce a concept of ‘broad’ consent for research, which does not require consent to be specific or current. Another proposal says that consent will be required for organisations to trade in personal information (great, I first thought), but then it goes on to say that in this case, a ‘consent’ to trade in personal information could be “made a condition of accessing goods or services”, so long as “the trading of personal information is reasonably necessary for (the organisation’s) functions or activities”.
In other words, the Final Report says that forced or bundled consent – which is not voluntary and often also not specific – will be acceptable, when a business wants to disclose your personal information for “benefit, service or advantage”.
That position from the Attorney-General’s Department is surprising, to say the least. Legislating to allow for the concept of ‘forced consent’ is the direct opposite of what the ACCC recommended.
(In their 2019 DPI report, the ACCC called out the problem when a purported consent is tied to the provision of services: “many businesses seek consent to data practices using click-wrap agreements, bundled consents, and take-it-or-leave-it terms where consumers are not provided with sufficient information or choice regarding the use of their personal information”. The ACCC also found this has an impact on consumers, given the considerable imbalance in bargaining power between digital platforms and consumers. This is why they recommended “strengthening consent requirements and pro-consumer defaults” – a recommendation to which the Government offered their in-principle support.)
So introducing carveouts for digital platforms or loyalty schemes, to tie consent to the provision of their services, undermines one of the ACCC’s key recommendations, and arguably the raison d’être of this Privacy Act review.
Allowing for the concept of ‘forced’ or ‘bundled’ consent is also the opposite of what the OAIC says the ‘voluntary’ test requires, what consumers want, and what the 2021 Discussion Paper proposed. It runs counter to the articulation of consent in the GDPR, putting Australia further out of alignment with global standards, and against what both European regulators and the Court of Justice of the European Union have said is acceptable when relying on consent as the basis for processing personal data.
Proposal 20.4 (read in its entirety, not just the executive summary version) would effectively legitimise an industry of trading in personal information, in a way that is arguably unlawful today. So this proposal is not only not going to fix our existing problems, it is going to make them so much worse.
It will cement in the worst business practices we have seen. Organisations will be able to bury people in fine print, force them to ‘agree’ to T&Cs in one click, and then claim that they have the ‘consent’ of those individuals, in order to use or disclose their personal information for purposes unrelated to the primary purpose of the collection, on the basis that the exploitation or monetisation of the data is ‘reasonably necessary’ to support the company’s business model.
Reverting to the definition of consent proposed in the 2021 Discussion Paper will help prevent ongoing privacy harms caused by entities exploiting unclear consent requirements.
(It’s also worth noting that the Final Report bid farewell to the promise of those ‘pro-consumer’ defaults, which had in-principle support from the Government in 2019, in favour of ‘clear and easily accessible’ privacy settings, which companies can set to whatever they like and then hide behind choice architecture, the accessibility of which we will be debating for years to come.)
Leaning into transparency, instead of out
The 2021 Discussion Paper evidenced an intention to reduce reliance on the ‘notice and consent’ self-management model of privacy regulation, in favour of stricter limits on collection, use and disclosure. (That’s where the ‘fair and reasonable’ test proposal came from: as a way to strengthen the rules for collection, use and disclosure, without making individual citizens or consumers do the heavy lifting.)
So we were expecting to see only a limited role for notice or privacy policies in terms of delivering on privacy protections. However, the Final Report contains multiple proposals which purport to deal with complex privacy issues by simply saying that something needs to go in a privacy policy.
It’s like the review team ran out of steam, and instead of coming up with meaningful or effective proposals to improve business practices, relied on old-school ideas we know will make zero difference to anyone’s actual privacy.
For example, instead of reforming APP 11.2 to say that personal information must not be retained once the primary purpose for which it was collected has been fulfilled, proposal 21.8 is that privacy policies must include details of a company’s data retention periods. (Which, by the way, proposal 21.7 says companies can set for themselves, according to their “organisational needs”.)
Instead of abolishing the political parties exemption, proposal 8.2 is that political entities will need to publish a privacy policy, sending Australians on a quest to read each political party’s privacy policy in the event that the party holds their personal information. Like those documents will be worth the paper they’re written on.
Another example is that, instead of requiring that profiling and online targeting of individuals only occur on an opt-in basis, proposal 20.9 is that organisations must “provide information … to individuals” about what they’re doing. As if consumers or citizens can do anything with that information but sigh.
And instead of more useful algorithmic reforms such as a right to human review of automated decision-making, a requirement for algorithmic explainability, or a requirement for algorithmic auditability, proposals 19.1 and 19.3 are that privacy policies must mention what types of personal information will be used in certain types of automated decision-making, and individuals can “request meaningful information”, which would “include an explanation of how a decision was reached” – but only if it was reasonable for the entity to take steps to provide that information. As individuals there is not much we can do with that information, other than exercise our right to ‘object’, which itself is pretty weak. An ‘objection’ simply poses a question to the organisation about whether it complied with the Privacy Act; no prizes for guessing what the outcome of most objections will be. The right to object does not extend to finding out whether the model used to make the decision was biased, or simply not trained on enough data to make a reasonable decision based on the individual’s circumstances. Nor does it allow individuals to seek a human review of the decision.
These proposals are pointless. Transparency over the fact that your personal information is being used – via a document almost no-one ever reads anyway – achieves nothing.
Expecting transparency to deliver privacy protection is like shouting “buyer beware!” while selling a dodgy second-hand car with faulty brakes, and thinking that road safety will somehow be sorted out by individual consumers making informed choices.
Complexity instead of simplicity, administrivia instead of protections
One of the recurring themes in the Final Report is the need to enhance clarity, while minimising compliance burden. Yet there are a number of proposals which will add to organisations’ compliance burden, rather than relieve it, but for little privacy gain.
I’ve already mentioned the extra requirements to add yet more fluff into privacy policies.
The attempt to set standards for not only how ‘personal information’ is handled, but also how ‘de-identified’ and ‘unidentified’ information are to be handled, is another example. This massively over-complicates how the Privacy Act works. It would be better to more clearly define ‘personal information’ to include when individuals can be singled out and acted upon (even if their identity is not known). Then we would not need extra protections or complex rules for some types of ‘de-identified’ or ‘unidentified’ information.
Of course I have sympathy for the costs of compliance, faced by small businesses in particular. But it needs to be recognised that maintaining the small business exemption generates a compliance cost too.
Asking small businesses to apply complex legal tests to figure out if they are even regulated or not is one example of compliance burden. For example, Proposal 6.2 is that certain activities of small businesses could cause them to lose the benefit of the exemption immediately, such as the collection of biometrics for facial recognition, or trading in personal information. It costs time and money for businesses to figure out whether or not they ‘trade’ in personal information; and some businesses will assume incorrectly that they don’t.
By contrast, “there is one set of privacy principles for everyone who handles personal information” is a hell of a lot easier to explain via regulatory comms, guidance, training and support. And the easier it is to understand a law, the easier it is for businesses to comply with it.
(Plus, if the small business exemption were to go, the compliance burden on businesses both large and small, which seek to do business in Europe and elsewhere, would be lifted if Australia was to receive an ‘adequacy’ ruling from the European Commission.)
Maintaining the small business exemption also makes other proposed reforms a bit pointless. Trying to crack down on disclosures of personal information overseas (including the proposal to extend those protections to de-identified data too!), while ignoring disclosures made to the 90% or so of businesses within Australia which are exempt from the Privacy Act now, is a waste of regulatory effort.
We also suggest that the introduction of a distinction between data controllers and data processors will have significant administrative impacts on APP entities, while providing little benefit to individuals. That’s just administrivia and more work for lawyers, instead of actual privacy protection. And again, the compliance burden for understanding and managing that distinction will fall disproportionately on small businesses.
Wimping out on the big exemptions
Speaking of the small business exemption…
There are currently four big carveouts from the scope of the Act: the exemptions for employee records, small businesses, political parties and media organisations.
The removal of these four carveouts – or at the very least the employee records and small business exemptions – will be high on the checklist for the European Commission when it next comes to assessing the ‘adequacy’ of Australia’s updated laws. So for a review that has alignment with our major trading partners as one of its objectives, you would expect to see some swift action here.
While at first glance it would appear that the small business exemption is to be abolished (and that is how initial media reporting presented the reform proposals), on closer reading it becomes apparent that this reform would only be implemented at some undefined point in the future, conditional on so many different things first being achieved that it may never happen at all. For example, the report suggests that the small business exemption should only be abolished once “small businesses are in a position to comply” with the Act. How do you measure or demonstrate that? (And who is supposed to decide when that test has been met? It’s not like every large business is perfectly in compliance with the Act already.)
The proposed ‘we promise to abolish the exemption… sometime… maybe’ approach will not fool the European Commission, and puts an adequacy ruling for Australia at risk.
Right now the compliance burden is simply time-shifted for growing businesses. Start-ups will be better off if they need to think about building ‘privacy by design’ into their products from the start, instead of having to retro-fit once they hit an arbitrary financial turnover.
Plus, how else will the Government achieve its objective to jolt Australia out of our collective cybersecurity daze, unless it implements an economy-wide set of rules for handling data? The Privacy Act is the perfect instrument to lift foundational data security capability across the nation.
So let’s get rid of the exemption now – but grant some latitude to smaller businesses when it comes to enforcement, like an extra year before civil penalties or a direct right of action apply for any breaches, and a tiered penalty scheme.
Meanwhile the other three exemptions are not proposed to be abolished at all. The ‘journalism’ and ‘political acts and practices’ exemptions are proposed to be revised so that some obligations will be imposed, such as data security and data breach notification, but not all the APPs. The employee records exemption is proposed to be revised, but only following further consultation.
Chapters 6 through 9 of the Final Report read like the Department is desperately trying to have its cake and eat it too: somehow mollify the public and trading partners who want action, without upsetting powerful interests like politicians, business lobby groups and the media. The result is a triumph of politics over public expectations: trying to justify the unjustifiable when it comes to the four big exemptions.
Conclusion: our top six fixes
Some surprises, like birthday cake sprung on you by your buddies, are wonderful happiness-inducing events. But some of the surprises in this Final Report have generated more furrowed brows than smiles.
We know the privacy and related harms that can arise from information handling practices which are unfair, opaque, intrusive or insecure. We know where the weaknesses in the current legal regime are. We know what the community expects. Now is the time for the Australian Government to deliver on the promise of meaningful reform.
However, we are concerned that without changes, some of the proposals in the Report will not, in practice, achieve the crucial objectives of reform – and may even run counter to them.
So, what could we do to fix these problems?
In my view, we need six critical things:
- the definition of personal information to clearly include when people can be singled out and acted upon, even if their identity is not known
- the Act to state that targeting or trading in personal information can never be considered a primary purpose or ‘related’ secondary purpose (therefore organisations will need individuals to ‘opt in’ via consent, in order to target the individual or trade in their personal information, unless another law allows it)
- the definition of consent to be amended to the effect that consent will only be valid if an individual had an informed choice to say ‘no’ to a specific request, unbundled from the terms of service, and then unambiguously and proactively chose ‘yes’
- implementation of the ‘fair and reasonable’ test: it may not be a perfect model, and there will still be plenty of argy-bargy about what it means in any given context, but frankly any industry arguing against the proposal needs to justify to the Australian public why their data use practices should be allowed to be unfair or unreasonable
- the Act to state that direct marketing will only be ‘fair and reasonable’ if the personal information was collected directly from the individual (who must have been an adult at the time of collection), in circumstances where direct marketing is a directly related secondary purpose to the original purpose of collection, and if the individual was given notice of the collection and proposed use for marketing, and they were given the option to ‘opt out’, and the individual has not opted out; and
- abolition of the small business exemption now – but with some latitude granted to smaller businesses when it comes to enforcement, like an extra year before civil penalties or a direct right of action apply for any breaches, and a tiered penalty scheme.
Want more?
On 4 April we ran a webinar to understand the Privacy Act Reforms – what’s proposed, what’s next, and how to prepare. The 90-minute recorded presentation and a copy of the associated handouts are now available, along with The Privacy Act in a Nutshell, as part of our Privacy Act Reforms Bundle.
The Salinger Privacy submission goes into more detail about the reform proposals, the ones we support and the ones we don’t, as well as potential solutions. Plus more info and links are on our Privacy Reforms hub page.