This week I had the pleasure of attending a seminar on the Risk-Based Approach to Privacy. The keynote speaker was Richard Thomas, the former UK Data Protection Commissioner – although as he pointed out in his speech, he never liked the European term ‘data protection’, because privacy is about protecting people.
Richard elaborated on a methodology being developed by the Centre for Information Policy Leadership, in which abstract privacy principles can be applied to a project, according to an assessment of the likely risks and benefits arising. The idea is to go beyond ‘tick-box compliance’, and get to the point of privacy protection – protecting people from harm.
I like to think that’s what we have already been doing with our clients when we conduct PIAs! Indeed, Richard acknowledged that the adoption of Privacy Impact Assessment (PIA) methodologies in Australia and New Zealand is well ahead of other countries.
During the Q&A, I raised a question: Who should conduct the risk identification for a project, and what should their skills be? I noted that sometimes people in positions of privilege, such as senior managers experiencing career success who are predominantly white, male and middle class, may struggle to imagine privacy harms that they have never personally experienced, such as discrimination, harassment, stalking or family violence.
In particular, I noted that data about a person’s home address – or, increasingly, geolocation data, which can reveal not only physical location but also patterns of behaviour – is often collected and exposed by organisations in a fairly casual fashion, and yet for some individuals, the exposure of their location data could lead to very serious harm.
(Taking the alternative strict legal approach won’t necessarily assist the lay person to identify the heightened privacy risks in their project either. For example, privacy laws in Australia don’t recognise location data as ‘sensitive’ in the way that medical records are.)
Richard’s reply to my question noted that senior managers aren’t doing their job well if they don’t know their customer base, and so they should be capable of recognising when those risks might arise for their customers. The discussion then moved on to examples of companies and government agencies which have got it wrong, and suffered spectacular ‘privacy fails’ as a result.
I suggest that a diverse set of skills is needed to conduct a robust privacy risk assessment. Legal and analytical skills are certainly needed, and so is the ability to understand how data might be collected, collated and presented to system users and third parties. But I believe that some underrated ‘soft’ skills like imagination and empathy are required too.
Something we do intuitively when we conduct a PIA for a client – before we even start to analyse compliance with the relevant privacy principles – is to imagine ourselves ‘standing in the shoes’ of their customers or citizens.
We then ask two questions: “What would I expect to happen?” and “How might I be harmed by this?”
If you are doing a privacy risk assessment of a project, start by asking yourself those two questions too. Avail yourself of whatever resources you can find to help you measure or predict your customers’ expectations, and use your imagination to think of worst-case scenarios.
If you need some prompts to help you imagine the possible harms that might arise, this is where I find Richard Thomas’s work adds the most value. He has articulated a set of potential ‘privacy harms’, which offers a novel approach for those used to focusing on risk or harm from the organisation’s perspective.
This spectrum of privacy harms has tangible or ‘material’ harms at one end, such as physical harm or threats of violence, stalking and harassment, identity theft, financial loss and psychological damage. In the middle sit intangible or ‘moral’ harms, such as reputational damage, humiliation, embarrassment or anxiety, loss of autonomy, discrimination and social exclusion. At the other end are abstract or ‘social’ harms, such as the threats to democracy, the chilling effect on free speech, and the loss of trust and social cohesion posed by a ‘surveillance society’.
The task of the privacy professional is to pull together a robust risk assessment, encompassing both sharp analysis of compliance with legislation, and the more imprecise ‘what if’ scenarios generated through imagination and intuition.
If you need further assistance with privacy risk identification, or a formal Privacy Impact Assessment, just give us a call.
(And by the way, ‘stand in their shoes’ is also an exercise we explicitly do with participants in our face-to-face privacy awareness training program. If you think your colleagues need some help achieving a shift in mindset about the importance of privacy to your organisation, ask us about which training program would work best for you.)