Indirect identification, individuation, disambiguation, distinguishing from all others, or singling out… call it what you want, but the statutory definition of ‘personal information’ needs to clearly state that it includes when individuals can be singled out and acted upon, even if their identity is not known.
Yet as they now stand, the proposals to amend the Privacy Act do not include this critical reform.
We wrote previously about some of the themes arising from the Final Report into the review of the Privacy Act.
One of the surprises, but not of the happy-surprise-birthday-party kind, was the way in which ‘personal information’ has been treated.
The definition of personal information is a critical threshold definition, because the privacy principles only apply to personal information. If a business can successfully argue that some data is not personal information, it can collect, use, disclose and trade that data with impunity.
Right now, the definition of personal information turns on whether someone is “reasonably identifiable”. But that phrase is foggy, to the detriment of businesses and consumers alike, who may ask:
Can someone be ‘identifiable’ if a business doesn’t know their name?
The OAIC says yes.
In guidance dating back to 2017, and in a string of case law determinations, the OAIC has maintained that ‘identifiability’ in law does not necessarily require that a person’s name or legal identity can be established from the information. Instead, it implies uniqueness in a dataset: “(g)enerally speaking, an individual is ‘identified’ when, within a group of persons, he or she is ‘distinguished’ from all other members of a group… This may not necessarily involve identifying the individual by name”.
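The OAIC’s notion of being ‘distinguished’ from all other members of a group can be sketched in a few lines of code. The attributes and records below are invented for illustration only: the point is that a unique combination of ordinary attributes singles an individual out, with no name anywhere in the data.

```python
# A minimal sketch of 'singling out': a combination of ordinary
# attributes can distinguish one record from all others in a group,
# even though no record contains a name. All data is invented.

records = [
    {"postcode": "2000", "age_band": "30-39", "device": "iPhone"},
    {"postcode": "2000", "age_band": "30-39", "device": "iPhone"},
    {"postcode": "2000", "age_band": "30-39", "device": "Android"},
]

def is_singled_out(target, dataset):
    """True if exactly one record matches the target's attribute combination."""
    matches = [r for r in dataset if r == target]
    return len(matches) == 1

# The third record's attribute combination is unique in this group, so
# that individual is 'distinguished from all other members of a group'
# in the OAIC's sense - without being identified by name.
print(is_singled_out(records[2], records))  # True
print(is_singled_out(records[0], records))  # False (two matching records)
```

Once a record is unique in this way, anything done with it — profiling, targeting, pricing — is done to one specific person, which is exactly why the OAIC treats such data as identifying.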
The Attorney-General’s Department says yes.
The Final Report quotes without challenge the OAIC position, and states that “The test does not require that an individual’s legal identity be known provided the information could be linked back to the specific person that it relates to”.
Judging by the European and Californian privacy laws, among others, our global trading partners say yes.
Each has either explicitly expanded the meaning of identifiability, or introduced alternatives to it as a threshold element of its definition. The GDPR calls it ‘singling out’. The California law (CCPA) includes, within its definition of personal information, data which is “capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household”, without first needing to pass an identifiability test. The 2019 international standard for Privacy Information Management, ISO 27701, takes a similar approach.
Australians want the answer to be yes.
In a 2018 Roy Morgan Report on Consumer Views and Behaviours on Digital Platforms, 79% of digital platform users considered telephone or device information, and 67% considered browsing history, to be information that could reasonably be used to identify them when doing things online. The OAIC’s 2020 Community Attitudes to Privacy Survey showed that only a quarter (24%) of Australians feel the privacy of their personal information is well protected, and that the vast majority (83%) of Australians would like the government to do more to protect the privacy of their data. Public support for strengthening the Privacy Act surged in 2022, in the wake of the Optus and Medibank data breaches.
And the most recent research, released last month by the Consumer Policy Research Centre, found that the majority of Australians regard things like their IP address, device IDs, location data and online search history to be their ‘personal information’ – and that the rates for these categories were even higher than for categories of data like sexuality and disability. This research showed that the majority of Australians were also uncomfortable with that type of data being used by companies to create a personal profile, or with it being collected from or shared with other companies.
But right now, some industries say no, or lobby to make the answer no, or add to the fog by obfuscating the terminology when dealing with consumers and pretending the answer is no.
Some industries are exploiting the fog around the definition to match up customer records from different devices and different apps and share user attributes between different companies, by arguing that the data they are using is not ‘reasonably identifiable’, and thus the privacy rules (which prohibit unrelated companies sharing their customers’ personal information without consent) do not apply.
(For example, we have seen industry arguments that no-one can be “reasonably identified” from facial detection, or non-cookie based targeted advertising. And law academic Katharine Kemp has highlighted the disingenuous claims about ‘anonymous’ data made to consumers by media and AdTech companies, especially when compared with what they privately tell brands about their cross-brand data-matching, online tracking, profiling and ‘addressable’ targeting capabilities.)
In an article about the law reform proposals, one industry player was quoted as admitting that reforms will “force us to stop doing some things that we probably shouldn’t have been doing anyway”. Another said, of the use of hashed emails: “It’s very easy to link those two data sets together and then re-identify the personal information”. So if the law is clarified to state that pseudonyms like hashed emails (which facilitate data-matching at the individuated level) constitute ‘personal information’, the result will be “a big impact on existing industry practices”, because “there are thousands of adtech companies and publishers using hashed emails” (to match up data about customers, profile and target them without consent).
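The “very easy to link” claim about hashed emails is worth making concrete. The sketch below is a hedged illustration (not any company’s actual pipeline, and all emails and attributes are invented): because a hash of a normalised email is stable and deterministic, two unrelated datasets can be joined on the hash, linking profiles about one individual without either party ever exchanging the plaintext address.

```python
# A sketch of hashed-email matching: the hash acts as a stable
# pseudonymous identifier shared across unrelated datasets.
# All emails and attributes below are invented for illustration.
import hashlib

def hashed(email: str) -> str:
    """Normalise and hash an email address (a common AdTech matching key)."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Dataset held by a publisher, keyed by hashed email of logged-in readers.
publisher = {hashed("jane@example.com"): {"articles": ["politics", "health"]}}

# Dataset held by an unrelated retailer, keyed the same way.
# Note the different capitalisation/whitespace: normalisation still
# produces the identical hash for the same person.
retailer = {hashed("Jane@Example.com "): {"purchases": ["vitamins"]}}

# Joining on the shared hash links the two profiles as one individual,
# even though neither party learned the other's plaintext email.
merged = {
    h: {**publisher[h], **retailer[h]}
    for h in publisher.keys() & retailer.keys()
}
print(merged)
```

The hash never names Jane, but it individuates her perfectly: every dataset keyed on it describes the same person, which is precisely why hashed emails facilitate data-matching “at the individuated level”.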
As Sven Bluemmel, the Victorian Information Commissioner, put it: “I can exploit you if I know your fears, your likely political leanings, your cohort. I don’t need to know exactly who you are; I just need to know that you have a group of attributes that is particularly receptive to whatever I’m selling or whatever outrage I want to foment amongst people. I don’t need to know your name. And therefore, arguably depending on how you interpret it, I don’t need ‘personal information’. I just need a series of attributes that allows me to exploit you.”
The media publishing and AdTech industry players know that this is true, but they will do what they can to maintain fog around the phrase ‘reasonably identifiable’, so that their practices can stay in the shadows.
So long as the wording in the statutory definition of ‘personal information’ is not clear and precise, the fog will not dissipate. The much-touted reforms will fail to stop these widespread, covert data-sharing practices.
Fear of the fog clearing is the reason that industry is pushing to water down the proposed reforms – or kill them off entirely. In what has been described as a ‘privacy counterstrike’, industry lobby groups representing digital platforms, media giants and advertising companies are “planning a cross-industry counteroffensive in a bid to wind back key proposals” in the Final Report.
One of the digital and AdTech industry’s objectives is to lobby for “a more reasonable definition [of personal information] … [the proposed definition] doesn’t seem workable”.
And what is this radical and unworkable proposal in the Final Report that is deserving of such a focused counteroffensive? To change the word ‘about’ to ‘that relates to’. That’s it. Three words. THREE WORDS. As per the Final Report, that’s the only actual change proposed to the statutory definition of ‘personal information’. The rest would be put in guidance, or in a list of things which might or might not be personal information, or in a list of things that organisations should ‘have regard to’, when ‘doing their own assessment’ about what the definition might mean.
Which means that industry lobbying has already been successful, because this is a watering down of what was proposed by the Department in 2021, in their Discussion Paper on the review. Then, the proposal included adding a whole extra sentence! Perhaps the review team was persuaded that the sky would fall in if they dared to add the following words into the Act, as they originally proposed in 2021 – and hold on to your hats here, because this is scary radical stuff:
“An individual is ‘reasonably identifiable’ if they are capable of being identified, directly or indirectly.”
The 2021 Discussion Paper stated that such a definition “would cover circumstances in which an individual is distinguished from others or has a profile associated with a pseudonym or identifier, despite not being named”.
I know, vive la revolution it ain’t, but nonetheless this modest proposal from 2021 doesn’t even appear in the Final Report.
According to the Final Report, only one submission argued against the proposition that the definition should be amended to expressly include when someone can be distinguished from all others in a group (even if not named) in order to be profiled, targeted, or acted upon in some way.
And yet … the proposals in the Final Report don’t deliver the clarity or strength that we need.
Instead of proposing an amended definition of personal information that would clearly encompass the types of individuating identifiers that allow online behavioural advertising and other practices to go unchecked, the Final Report proposes a whole separate regime for regulating certain use cases, like direct marketing, online targeting and trading. (And they do this by not touching the definition of ‘personal information’, but by saying that for these special regulations of these special use cases, sometimes de-identified or even unidentified data will be within scope as well.)
But then when you read the details of those new rules in Chapter 20 of the Final Report, the only substantial right is to opt out of being shown targeted advertising. Those proposals will not affect the collection of information, the building or sharing of profiles, or the use of our information to create ‘lookalike audiences’ at all; all the stuff behind the scenes gets a free pass.
Plus it’s not like privacy harms only come from direct marketing, online targeting or trading in personal information. The ability to distinguish one individual from others, in order to track, profile, locate, contact or influence them, is also the starting point for stalking, surveillance and abuse. Privacy harms can come from personal digital assistants, chatbots or generative AI giving erroneous advice, or apps which monitor health then leak information to third parties. They can come from anti-abortion activists targeting women using geo-fencing.
Individuation online is the diesel that fuels the algorithmic engines, amplifying the voices of influencers, and powering the trains of online hate, misinformation and extremism which lead to everything from the explosion in mood disorders to pro-anorexia content to Holocaust denial and genocide.
As we’ve said in our submission to the Privacy Act review, playing legislative whack-a-mole by trying to regulate specific use cases is guaranteed to make the Act out of date the day it is amended. Regulating specific use cases will also just shift the battleground, such that arguments will become about what business practices are in or out of those defined use cases. (And that’s before we even get to the proposed extra rules for ‘unidentified’ and ‘de-identified’ data as well, which would not be needed if the definition of personal information was fixed instead.)
We also already know that shifting definitional matters to OAIC guidance just doesn’t work. The practices I’ve described here take place now, despite the existing OAIC guidance that the definition of personal information (and therefore all the privacy rules) apply to data which enable an individual to be ‘distinguished’ from all other members of a group, without needing to know their names.
As the Final Report itself states, codifying OAIC guidance makes propositions “more readily enforceable”.
That’s why it is essential to amend the statutory definition of ‘personal information’ in the Privacy Act, which applies across all industries, and all use cases, and cannot be ignored.
Indirect identification, individuation, disambiguation, distinguishing from all others, or singling out… call it what you want, but the statutory definition of ‘personal information’ needs to clearly state that it includes when individuals can be singled out and acted upon, even if their identity is not known.
So dear Attorney-General, please take this historic opportunity to strengthen but simplify and clarify the law. Just one extra sentence will do it:
“An individual is ‘reasonably identifiable’ if they are capable of being distinguished from all others, even if their identity is not known.”
That one extra sentence would clear the fog, protect Australians the way they expect, simplify compliance, stop the disingenuous claims by industry, and bring Australian law closer to alignment with that of our trading partners, by building into the wording of the Act itself what is already in determinations and guidance from the OAIC.
It’s only one extra sentence, but it will make all the difference.
Want more?
On 4 April we ran a webinar to understand the Privacy Act Reforms – what’s proposed, what’s next, and how to prepare. The 90 minute recorded presentation, and a copy of the associated handouts, are now available, along with The Privacy Act in a Nutshell, as part of our Privacy Act Reforms Bundle.
The Salinger Privacy submission goes into more detail about the reform proposals, the ones we support and the ones we don’t, as well as potential solutions. Plus more info and links are on our Privacy Reforms hub page.
PS – post updated 24 April to correct a link to industry comments published in February.