A recent determination by the OAIC in the Flight Centre case demonstrates the potential for privacy harm when personal information is recorded and stored inappropriately. In that case a free-text field designed for a different purpose was used by some staff – contrary to company policy – to enter credit card and/or passport numbers. This led to the disclosure of almost 7,000 customers’ valuable personal information.
This month’s blog reviews the case, and highlights the implications for all organisations with respect to data security, as well as the role of consent and your privacy policy.
Background
In 2017 Flight Centre – a large travel agency in Australia – organised an event known as a ‘design jam’, which was intended to create technological solutions for travel agents to better support customers. Sixteen teams participated in the event, without being required to sign a non-disclosure agreement.
Participants were given access to 106 million rows of data representing transactions across the previous two years, including over 6 million individual customer records.
Although steps were taken to de-identify the customer records within the dataset, those steps were insufficient. In particular, a free text field which remained in the dataset included personal information about almost 7,000 customers. Contrary to company policy, some staff had used the free text field to store customers’ credit card numbers and passport numbers. (The free text field was actually intended to be used by staff to communicate internally about a customer’s booking.) The existence of this personal information within the dataset was not picked up during a review of 1,000 rows of the dataset before it was released to the event participants.
Can a Privacy Policy demonstrate consent?
The Flight Centre case provides a useful illustration of the necessity of meeting all five elements in order to obtain a valid consent. Consent must be voluntary, informed, specific, current, and given by a person with capacity.
The respondent had argued that its Privacy Policy permitted the use of personal information for product development purposes (which was the business objective behind the design jam event) because customers had consented to this use via the Privacy Policy in the course of transacting with the company. However the OAIC disagreed, noting that an organisation “cannot infer consent simply because it provided an individual with a policy or notice of a proposed collection, use or disclosure of personal information”.
Further, the OAIC stated that a Privacy Policy is “a transparency mechanism… It is not generally a way of providing notice and obtaining consent”.
In relation to the particular Privacy Policy, the OAIC found:
- “consent could not be obtained through the Privacy Policy as it was not sufficiently specific, and bundled together different uses and disclosures of personal information”;
- “a request for consent (should) clearly identify the kind of information to be disclosed, the recipient entities, (and) the purpose of the disclosure”; and
- “Any purported consent was not voluntary, as the Privacy Policy did not provide individuals with a genuine opportunity to choose which collections, uses and disclosures they agreed to, and which they did not”.
In the absence of consent, Flight Centre was found to have disclosed personal information in breach of the Use & Disclosure principle (APP 6).
Data security failures
The OAIC also found that Flight Centre had breached the Data Security principle (APP 11), stating:
“the storage of passport information and credit card details in a free text field (in a manner inconsistent with applicable policies), and the absence of technical controls to prevent or detect such incorrect storage, caused an inherent data security risk in terms of how this kind of personal information was protected”.
The OAIC also noted:
- “the respondent should have implemented technical controls that would detect whether staff had included credit card details and passport information in the free text field of its quoting, invoicing and receipting system”, and
- a “reasonable step” would have been “to implement an automated scanning technique to review data” to check for any remaining personal information prior to the disclosure.
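The kind of automated scan the OAIC describes could, as a minimal sketch, combine pattern matching with a Luhn checksum to filter out most false positives. Everything below is illustrative – the regexes, the passport number format and the sample rows are assumptions for the purpose of the example, not details of Flight Centre’s actual systems:

```python
import re

# Candidate credit card numbers: 13-19 digits, optionally separated by spaces or hyphens
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,19}\b")
# Illustrative passport number pattern: 1-2 letters followed by 7 digits
PASSPORT_RE = re.compile(r"\b[A-Za-z]{1,2}\d{7}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum - weeds out most random digit strings that merely look like card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    if not 13 <= len(digits) <= 19:
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_free_text(rows):
    """Yield (row_index, match_type) for rows whose free-text field
    appears to contain a card number or passport number."""
    for i, text in enumerate(rows):
        for m in CARD_RE.finditer(text):
            if luhn_valid(m.group()):
                yield i, "credit_card"
        if PASSPORT_RE.search(text):
            yield i, "passport"

# Hypothetical free-text booking notes
rows = [
    "Client prefers aisle seat",
    "Card 4111 1111 1111 1111 exp 09/25",   # a well-known Luhn-valid test number
    "Passport N1234567 sighted at check-in",
]
print(list(scan_free_text(rows)))  # → [(1, 'credit_card'), (2, 'passport')]
```

A scan like this run over the full dataset before release – rather than a manual review of a 1,000-row sample – would have flagged the offending rows. Production tools add more patterns and context checks, but the principle is the same.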
Of particular interest are the OAIC’s conclusions about the role of business process as well as system design:
“The steps required to protect an entity’s information holdings from unauthorised disclosure will invariably be multi-layered and multi-faceted. Entities should assume that human errors… will occur, and design for it”.
Lessons learned
What can we learn from Flight Centre’s failures?
First, understanding the human element in data breaches is critical. Policies and procedures, and staff training, are not enough. The OAIC noted that organisations should assume that human errors will occur, and should design systems accordingly. That means both technical controls to prevent poor practices, and assurance testing to find and remedy them.
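A ‘prevent’ control of this kind could be as simple as validating free-text input before it is ever stored, rather than (or as well as) scanning after the fact. This is a hypothetical sketch – the function name, error type and pattern are assumptions for illustration only:

```python
import re

# Anything that looks like a 13-19 digit run (with optional separators)
# is treated as a possible payment card number.
DIGIT_RUN = re.compile(r"(?:\d[ -]?){13,19}")

class ProhibitedContentError(ValueError):
    """Raised when a note appears to contain a payment card number."""

def save_booking_note(note: str, store: list) -> None:
    """Reject card-like content at write time, before it reaches storage -
    a preventive technical control, as opposed to an after-the-fact scan."""
    if DIGIT_RUN.search(note):
        raise ProhibitedContentError(
            "Note appears to contain a card number; store payment "
            "details only in the designated, protected field."
        )
    store.append(note)

notes: list = []
save_booking_note("Window seat preferred", notes)
try:
    save_booking_note("Card: 4111 1111 1111 1111", notes)
except ProhibitedContentError:
    pass  # blocked before storage
```

A control at this point in the workflow means the non-compliant data never accumulates in the first place, so later extracts and datasets have nothing to leak.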
In the case of Flight Centre, while internal policies were clear, the OAIC found that a number of staff had not followed them routinely over a significant period. Further, the OAIC found that technical controls and assurance procedures were inadequate to address the storage of data by staff in inappropriate fields. This created an inherent privacy risk.
However, much like the data breach arising from the public release of a Myki card dataset to participants in a hackathon event, no privacy impact assessment was completed, because of the mistaken belief that the data had been fully de-identified and no personal information remained. The lessons: mandate a Privacy Impact Assessment methodology across all projects, and don’t take de-identification claims at face value.
Another lesson learned is the need to use available technology to scan for personal information stored within your systems, especially within free text data fields and unstructured data.
A risk management lesson learned is to apply third party risk assessment procedures to unusual situations such as hackathon events, in which third parties – which are not typical contracted service providers, vendors or suppliers – are nonetheless given access to, or copies of, data. Flight Centre admitted that its vendor management policy was not followed in the lead-up to the design jam event; nor were event participants asked to sign a non-disclosure agreement, or agree to any terms, before participating in the event.
And the final lesson learned is to not rely on your Privacy Policy to authorise your use or disclosure of personal information. As we said in Why you’ve been drafting your Privacy Policy all wrong, a Privacy Policy is not magic. It cannot authorise you to do anything that the privacy principles don’t already allow. The OAIC has said it clearly: a Privacy Policy is purely a transparency mechanism, and not a way of either providing notice or obtaining consent. If you need consent to authorise your conduct, that consent needs to be voluntary, informed, specific, current, and given by a person with capacity. It cannot be obtained by making your customers ‘agree’ to your Privacy Policy.
So don’t let your customer data turn into a privacy pickle: check staff compliance with policies about data storage, don’t take ‘de-identified’ claims at face value, routinely scan your information assets for protected data, use PIAs and third party risk management strategies on all types of projects, and whatever you do, don’t expect your Privacy Policy to magically shield you from compliance with your Use and Disclosure obligations.
Photograph © Shutterstock