Last Sunday evening, as I was cosily snuggled up on my sofa watching a murder mystery, my phone started beeping like mad. I had multiple text messages from friends and family, all asking my advice on the same thing: ‘What’s with the covid app? Should I download it? Is it a privacy risk?’
There is no ‘one size fits all’ answer to this question. So let’s run through the privacy pros, cons, and the ‘yet TBD’ features of COVIDSafe, so that you can decide for yourself.
Privacy positives
- Only people who download and register for the app have any information collected at all.
- The app does not track geolocation data.
- Registration data (the name, age range, postcode and telephone number you supply at the time you download the app and register to use it) is held in the National COVIDSafe Data Store, which although operated by the federal government is (we have been told) not accessible by the federal government, only by State and Territory health officials. (But then again, we have also been told that the federal government will use the system to send out SMS messages; that “data about generation of encrypted user IDs to create de-identified reports about uptake of COVIDSafe will be prepared by the Digital Transformation Agency” which is a federal government agency; and that if you want to delete your data you have to go through the Administrator which appears to be either the federal Department of Health or the DTA or a contractor. So maybe I should put this one in the ‘cons’ column.)
- You don’t need to provide your real name. (But if you call yourself Mickey Mouse on the app, remember to not hang up if you receive a call for Mickey saying you have maybe been infected… )
- There are multiple points at which you must consent; and you can delete the app any time you like.
- Using a Bluetooth ‘handshake’, the app collects encrypted device IDs from other devices your device was near. People who were physically near you will not find out your name or phone number from the app. (But, the app does collect the make and model of other devices you were near and stores this unencrypted on your device; so you or someone with access to your phone could start to figure out who you were near. And it is data about all devices with the app, not just those you were near for more than 15 minutes.)
- The app only stores the data from other devices for a rolling 21-day period.
- According to the Department of Health, State and territory health officials can only access app information if someone tests positive and agrees to the information in their phone being uploaded. The health officials can only use the app information to help alert those who may need to quarantine or get tested.
- Legal protections specific to this app and the related National COVIDSafe Data Store have been introduced. Importantly, the Determination issued by the Minister for Health under s.477 of the Biosecurity Act recognises and prohibits behaviour which could otherwise create pseudo-compulsory scenarios, such as employers only allowing staff to work if they are using the app, or service providers only allowing you into their shop or onto their train if you have the app. (The criminal penalties come via s.479 of the Biosecurity Act 2015.) Also, it is a crime for an unauthorised person to decrypt the ‘handshake’ data exchanged between users’ phones.
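The Bluetooth ‘handshake’ and rolling 21-day window described above can be sketched in miniature. This is a simplified Python illustration only: the real app’s data formats, field names and encryption scheme are not public, so every name below is hypothetical.

```python
import time

ROLLING_WINDOW_SECONDS = 21 * 24 * 60 * 60  # 21-day rolling retention period


class ContactLog:
    """Toy model of on-device contact storage (all field names hypothetical)."""

    def __init__(self):
        self.contacts = []  # records collected from nearby devices

    def record_handshake(self, encrypted_user_id: str, device_model: str):
        # The encrypted ID cannot be resolved to a name or number on-device;
        # but note the device make/model is stored in the clear, per the article.
        self.contacts.append({
            "encrypted_id": encrypted_user_id,
            "device_model": device_model,   # unencrypted!
            "timestamp": time.time(),
        })

    def purge_old_contacts(self, now: float = None):
        # Drop anything older than the 21-day rolling window.
        now = now if now is not None else time.time()
        cutoff = now - ROLLING_WINDOW_SECONDS
        self.contacts = [c for c in self.contacts if c["timestamp"] >= cutoff]
```

Even in this toy form you can see the privacy trade-off: the encrypted IDs reveal nothing by themselves, but the unencrypted device models sitting alongside them are exactly the kind of metadata someone with access to the phone could use to start re-identifying contacts.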
Privacy downsides
- My friends and family could not figure out their ‘is this a privacy risk for me?’ answer for themselves. That suggests the comms around the tech are not clear enough. This in itself is a privacy fail. Transparency is critical to facilitate informed consent.
- Contrary to earlier government promises, the source code has not been released for independent cybersecurity experts (or armchair amateurs for that matter) to review and test. Promising that the source code will come in a couple of weeks leaves room for concern that there might be vulnerabilities in the system which have not yet been found; for example that devices’ locations could be tracked by third parties (shopping centres, burglars) from the unencrypted data shed via Bluetooth from devices using the app. Going live without fulfilling that promise is not a good look for a government which has struggled with public trust in its data handling practices and technological competence.
- On Android phones, users must give permission for location to be recorded. Not that the app’s current design actually does track location (we are told), but a later update could change that for unsuspecting users.
- While the Privacy Impact Assessment (PIA) Report was publicly released as promised, it was only released at the same time as the app, leaving no time for journos to source expert review and commentary before the app went live amid uncritical exhortations to the public to download it ‘because it got the privacy tick’. In fact the PIA on COVIDSafe did not examine compliance, or the risks posed, by the State and Territory health departments which will actually be accessing and using the identifiable data, and which are covered by a different patchwork of privacy laws. (And in the case of SA and WA, no privacy laws.) The scope of the PIA was limited to the federal Department of Health’s compliance with the federal Privacy Act. The PIA Report’s authors called out this limitation in their report, along with the lack of time available to consult with State and Territory privacy regulators, civil society representatives or other experts. The PIA Report is not quite the all-encompassing ‘privacy tick’ the government would like us to believe.
- The legal protections in the Determination are temporary, and currently at the whim of a Minister, so they could be scrapped tomorrow without Parliamentary oversight. Parliament is expected to create a more permanent legal framework when it sits in May.
- Contrary to promises by government ministers, the legal framework does not yet prohibit law enforcement access to app metadata, which is why the Attorney General vowed to amend the telecommunications laws. Promising to introduce a more robust legal framework to deliver on those political commitments sometime after the app has gone live is reckless and a breach of faith with the Australian public.
- There is also an argument that the legal framework does not prohibit access to the data in the National COVIDSafe Data Store by agencies armed with a warrant, court order or their own ‘notice to produce’ powers. (This is because while s.477(5) of the Biosecurity Act says that the Minister’s Determination applies despite any other law, it also says at s.477(1) that the scope of the Determination is about what is necessary to prevent or control the spread of the disease; so to the extent that police conducting a murder investigation want access to data, and their access would not prevent the app or contact tracing from operating as per the Minister’s Determination, does the Determination really stop them?) This issue arises not only in relation to police agencies, but also national security agencies, anti-corruption bodies, Centrelink and the ATO, all of which have their own powers to compel other organisations to hand over data to them.
- Also, there is an open question as to whether the legal framework does (or even constitutionally can) regulate what happens ‘downstream’, once data has been copied by State and Territory health officials from the National COVIDSafe Data Store into their local systems. Minister Hunt’s Determination implies that it does cover State and Territory officials (because at cl.7(4) it exempts them from the requirement to keep all data in Australia); but the PIA Report states that the federal Department of Health loses “effective control” once the data passes to the States and Territories, and the Department of Health’s acceptance of the PIA Report’s Recommendation 12 implies that at best, the Commonwealth can only seek to have the States and Territories ‘agree’ to a data use protocol. Once held by State/Territory governments, we must rely on our existing (incomplete) patchwork of State and Territory privacy laws to regulate how officials in State and Territory Health Departments can or can’t use the data, how long they store it, how securely they store it, how they authenticate authorised users, what prohibitions and penalties are available to deter misuse, and how police (and other agencies) are prevented from accessing the data via their local health department. Binding State and Territory departments to an agreement with the Commonwealth Department of Health not to use the data for any purpose beyond contact tracing does not remotely cover the privacy risks to individuals posed by data breaches, deliberate misuse or police access via State and Territory Health Departments. Instead we need a legal framework which includes all downstream users and uses, as law firm Gilbert+Tobin has suggested.
- The legal framework does not make provision for independent audit, assurance or oversight of the operations of the app, the National COVIDSafe Data Store, or the downstream use of the data by State and Territory health agency users. There is no single complaints mechanism or opportunity for redress for victims of a privacy breach. (Criminal penalties don’t help victims.) So we have to rely on our patchwork of federal, State and Territory privacy laws, which often don’t allow redress for the victim of a privacy breach if a government agency can describe the breach as the action of a rogue employee.
- The legal framework does allow the federal government to use app data for “producing statistical information that is de-identified”. There have been some spectacular de-identification fails by governments here in Australia, so this makes me nervous. And of course, someone in the federal government needs to access the identifiable data in order to first de-identify it. Who will that be? Who checks they are doing the right thing?
- Deleting the app from your phone will not trigger deletion of your data in the National COVIDSafe Data Store. For that, you have to ask the “COVIDSafe Administrator” (which is who? the government website does not clarify) and the form includes a broadly-drafted ‘consent’ (which is not a valid consent because it is not optional), and which talks about using the data to respond to the disease, not, as you would hope, ‘to complete my deletion request’: the tick box says “I … consent to the information provided being used and disclosed by the Australian Government to enable the Commonwealth, state and territory governments to respond to COVID-19”.
- Identifiable information (the name, phone number etc you supplied at registration, plus whether you were infected, and presumably also whether you were contacted about possible infection) will be held in the National COVIDSafe Data Store, and remain accessible by State and Territory health officials, until the Commonwealth decides to delete it “after the COVID-19 pandemic has concluded”. Which might be… whenever.
- We don’t yet know if the app will work. (And there is no clear metric for what success looks like.) Operational issues are also privacy issues, because if you are trying to weigh up privacy risk versus public health benefit, you need to be able to quantify whether those health benefits are going to be realised. Will enough people download the app? Will it work properly on iPhones? Is the ‘15 minutes at 1.5m’ an accurate proxy for virus-catching risk? (If an infectious person sneezes in my face as they walk past me, I am at risk, but the app won’t know.) Will there be so many false positives that the manual contact tracers give up using the app as a contact tracing tool? (If I was on the other side of a sealed glass wall from the infectious person, was I really at risk of catching the virus from them?)
Could try harder
- There were other design decisions which could have been taken, to make the app way closer to achieving privacy and security via anonymity for users. Instead of creating a data store of people’s names and phone numbers, data could have been processed and push notifications issued almost entirely on and between people’s phones, with only randomised strings of gibberish stored in a public register. (Check out this comic for a quick and easy explanation of the privacy-preserving alternative model known as the DP-3T protocol being developed in other countries.) If the Australian Government had taken a proper Privacy by Design approach, as promoted by over 300 academics across 25 countries, we could have had an app with almost none of the privacy concerns, which wouldn’t have then triggered the need for urgent bespoke legal protections, because no identifiable information would ever have been stored by anyone. The failure to implement a DP-3T decentralised anonymised trace-and-notify model (without at least first considering it and then justifying on public health grounds why it was rejected as a model) is a significant privacy fail. (The PIA Report suggests the Department had chosen the current model, and it was not within scope for the PIA to consider other, less privacy-intrusive models.) It might also be a public health fail, because a more privacy-preserving model might have engendered greater public trust which might have led to higher download rates.
- Some people face higher privacy and safety risks in their everyday lives than many other people. Systems and products should be designed to protect those who are most vulnerable to privacy harm. These include victims of family violence, celebrity stalking and other physical threats; serving members of the judiciary, law enforcement and defence forces; political activists, journalists and whistle-blowers; and people who could be blackmailed or sacked if they were known to be frequenting a brothel or having an affair or talking to a competitor. If a system is not designed to protect the most vulnerable, their particular risks should at least be highlighted in a transparent way by the government, so that those individuals can make their own informed decision. Neither happened in this case.
- Politicians responding to genuine concerns from the public about privacy and security should not resort to jingoistic ‘Team Australia’ pseudo-patriotic rubbish, or ‘download this app or we can’t lift the restrictions’ bargaining or ‘maybe I’ll make it mandatory after all’ threats in response. Be truthful about the limitations, and seek to understand why some people have entirely legitimate fears about their privacy and safety. Be humble in acknowledging that concerns and criticisms come from a place of deep distrust and disquiet which has been entirely caused by a series of own goals by the Australian Government, given its appalling recent record on privacy across many fronts including Robodebt, doxing a Centrelink recipient for criticising the government, CensusFail, anti-encryption laws, pursuit of whistle-blowers and journalists, failures to comply with its own metadata laws, attempts to silence researchers who identify re-identification risks in government-published datasets, and more.
- Journalists and commentators responding to genuine concerns from the public about privacy and security should not rely on lazy ‘Facebook/Google already knows everything about me anyway’ non-analysis in response. Wake up and be critical in your thinking! (Um, maybe you shouldn’t let Facebook/Google know everything about you?) Government can tax you, fine you, cancel your driver’s licence, outlaw your profession, restrict your movements, seize your goods and throw you in gaol. Facebook cannot. The citizen/government relationship significantly affects the privacy risk profile of our interactions with government compared with private companies.
- Know that middle aged white men in white collar professions (yes I am looking at many of you, above-mentioned politicians, journalists and commentators) are the least likely category of people to face discrimination, violence, harassment or economic uncertainty, and thus have the least to lose from any privacy violations. So please understand that not everyone shares your risk profile. Be more inclusive in your thinking, and calibrated in your calculations of risk.
- The same goes for those who declare ‘I don’t trust the government therefore I won’t use the app’. Instead of holding a default ‘don’t trust’ position, I would argue for a more nuanced balancing of the pros and cons, by each individual, reflecting the privacy and safety risks they face, as well as their willingness to contribute in a small way to (maybe) helping to slow the spread of a terrible disease.
- The legal protections need clarification. Privacy advocate and legal academic Graham Greenleaf described the Determination as “flawed (despite good points): ‘proximity’ is undefined; uploading is at risk if someone else has possession/control of your phone; the deletion date is ill-defined; and the anti-coercion (clause) needs tightening”.
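The decentralised DP-3T alternative mentioned above can also be illustrated in miniature: each phone derives short-lived random ephemeral IDs from a secret daily seed, only the seeds of people who test positive are ever published, and matching happens entirely on-device. The following Python sketch is illustrative only; the real DP-3T protocol uses carefully specified cryptographic constructions, and the function names and sizes here are assumptions of mine.

```python
import hashlib
import hmac
import secrets


def daily_seed() -> bytes:
    """Each phone generates a fresh random secret seed per day."""
    return secrets.token_bytes(32)


def ephemeral_ids(seed: bytes, count: int = 96) -> list:
    """Derive short-lived broadcast IDs from the day's seed (HMAC-based, illustrative)."""
    return [hmac.new(seed, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(count)]


def check_exposure(observed_ids: set, published_seeds: list) -> bool:
    """On-device matching: re-derive IDs from the published seeds of infected
    users and compare against the IDs this phone actually heard over Bluetooth."""
    for seed in published_seeds:
        if observed_ids & set(ephemeral_ids(seed)):
            return True
    return False
```

The point of the design is that no central register of names or phone numbers is ever created: the only thing uploaded, and only with the infected person’s consent, is a random seed that reveals nothing about anyone’s identity.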
Conclusion
The app is not ‘one size fits all’, but maybe it is ‘one size fits enough’. From the point of view of a privacy advocate, it could certainly be better, but I also give the government credit for understanding that implementing privacy protections would be essential. And as they say, perfection is the enemy of good. And maybe, for this project, at this particular time in history, the privacy protection is good enough, for enough people. But – this experience also illustrates the importance of considering privacy earlier in the design cycle, and with an open mind about alternative designs.
So, what to do about the app? To download or not to download, that is the question.
No, the app is not as privacy-invasive as Facebook. (But if that is the standard by which I measured privacy risks in projects, I would have given up years ago.) Yes, it could have been designed better.
But, ever the optimist, my advice is this: if you don’t face a particular threat to your privacy or safety in your everyday life (in other words, if your name, postcode and phone number, and possibly inferences about who you have been near and where you have been, could be accessed by a violent partner, the police, your boss, a third party mounting a man-in-the-middle attack, or an almost-inevitable data breach, and you would still have no particular reason to be worried), and if you need to commute on public transport, serve customers or otherwise be close to strangers or large groups of people for decent chunks of time, then the health benefits you can offer to the people in your physical proximity by downloading the app likely outweigh any privacy risk to you. So if you feel comfortable with the app, go ahead and do something great for your fellow humans.
If you’re not yet comfortable (but don’t face particular privacy or safety risks in your everyday life), wait until there is a proper legal framework in place, and the source code has been pulled apart by independent experts and found to be secure.
Just remember that you will only receive health benefits for yourself if those around you also download the app and are in fact using it when they are near you. Many of us have all the best intentions, and no specific reason to be concerned for our own privacy or safety, but have an iPhone (on which the app may not work properly), or a flat battery, or left the phone at home, or don’t own a phone at all. Given all those ‘ifs’, whether the app will penetrate the populace deeply enough to enable its benefits to be realised is an open question.
And be vigilant in your scrutiny and demands for accountability. The federal government cannot be allowed to backtrack on any of its privacy promises about the app, and more can still be done to strengthen the legal framework’s privacy protections without impacting on the public health benefits.
Perhaps most importantly, from a privacy advocate’s point of view, as with multiple governments’ many fiscal, legislative and policy responses to the pandemic (everything from stimulus payments to changes to liquor licensing rules to procuring stockpiles of hand sanitiser for schools), we have seen how quickly good things can be done by governments, when they care. Don’t ever forget that lesson. When we demand legislative protections, policy solutions and better technology design to protect our privacy, governments are actually capable of delivering, fast. Don’t ever take ‘too hard’ for an answer on privacy protections again.
Photograph (c) Shutterstock