Best to peek carefully into your Christmas stocking this year, for Santa may have brought you more surveillance and security risks than you bargained for.
With the booming market for voice-controlled virtual personal assistant devices like Google’s Home and Amazon’s Echo, and warnings from the former head of MI5 about hackable smart toilets (which, frankly, doesn’t scare me quite as much as a hackable Boeing 757), the attention of regulators is finally turning to the impact of the Internet of Things (IoT) on our privacy and security.
When every little device can be connected to the internet, the questions include:
- Will the device stop working if it needs a security patch? When I want a cup of tea, I want it now, not when my kettle has finished updating its software.
- Will it be up to the consumer to check for, find and install security patches? A worldwide study of community attitudes towards the IoT found that one third of people thought it was the consumer’s responsibility, while about the same number felt it was the responsibility of the manufacturer or retailer.
- But what if the manufacturer doesn’t even issue security patches – will my smart fridge stop working if the manufacturer goes broke?
- Can the device be remotely controlled by the manufacturer? What will prevent your garage door opener from being remotely disabled by the manufacturer because you left them a bad product review? If Tesla can update your car wirelessly, can it also prevent you from driving somewhere you shouldn’t be? (And who gets to decide where a car should or should not be?)
- Can your data be ‘monetised’ by the manufacturer? Are you OK with your smart vacuum cleaner hoovering up your data along with your dirt, or your TV watching you?
- In fact, were you even told how your data might be used? A global sweep by Privacy Commissioners around the world found that 71% of IoT devices did not offer a privacy policy or collection notice to the consumer. And for those that do, the T&Cs are ridiculously long and complex. (I love this stunt, in which a consumer advocacy group asks the Norwegian Consumer Affairs Minister to go for a jog while they read her the privacy policy from her fitness tracker. The Minister manages to run 11km in the time it takes for the policy to be read out to her.)
- Can your data be accessed by your insurer? As if I want my life insurer finding out from my Fitbit data that I decided to stay home with a packet of Pringles instead of going out for a jog. Every night.
- Can your data be accessed by law enforcement? Can your driverless car be controlled by police?
- Can the device be manipulated by third parties? My favourite recent stories include the Burger King TV ad designed to set off Google Home devices and make them describe its products, and the child who accidentally bought a dollhouse through Amazon’s Echo; when that story was reported on the TV news, the newsreader’s words in turn prompted Echo devices in viewers’ homes to order dollhouses too.
- Can the device be remotely controlled by hackers? Everything from your kids’ toys to your own, er, adult toys may be vulnerable.
- Can simple devices be used to access more valuable systems or data? When a casino was hacked via its IoT fish tank, I felt like real life had become disturbingly like the absurdist plot of one of those bad Ocean’s Eleven sequels.
When stories like these abound, it is no wonder that consumers’ fears about privacy are suppressing demand for IoT devices.
Tech journo Stilgherrian has asked ‘How many must be killed in the Internet of Deadly Things train wrecks?’ Academics have called for an IoT Code of Ethics. And security researcher Troy Hunt has argued that consumer-oriented IoT devices should come with warning labels.
Indeed, the Australian Government recently announced it would introduce a rating system for connected household devices, backed by legislation if the industry did not self-regulate quickly enough. (Hey, how cute is the idea of a cyber-kangaroo logo!)
And so, just in time it seems, the IoT Alliance Australia has published its industry-led Good Data Practice guidelines. Like many of the examples cited above, the IoTAA guidelines focus on consumer-facing IoT devices, rather than bigger-ticket ‘smart cities’ programs to manage lighting, parking, traffic, energy and waste in public spaces, which, as the IoTAA recognises, raise their own privacy issues and need more specific guidance.
In addition to addressing the kinds of security vulnerabilities illustrated above, the IoTAA guidelines delve deeper into privacy concerns. Indeed, some of their Good Data Practice principles will sound pretty familiar to anyone who already works in privacy: one, follow privacy law; two, build in privacy by default and use privacy by design; and three, be accountable. These are sound principles which should apply in any sector.
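To make the second of those principles a little more concrete, here is a minimal sketch (in Python, with setting names invented purely for illustration) of what ‘privacy by default’ might look like inside a device: every data-sharing option ships switched off, and can only be switched on by an explicit user choice.

```python
from dataclasses import dataclass

@dataclass
class DeviceSettings:
    # Every data-sharing option ships switched OFF; the consumer must
    # actively opt in to anything more invasive ('privacy by default').
    share_usage_analytics: bool = False    # opt-in, never opt-out
    retain_voice_recordings: bool = False  # process locally, discard after use
    personalised_ads: bool = False         # off unless explicitly requested
    cloud_retention_days: int = 0          # keep nothing by default

def enable_analytics(settings: DeviceSettings, user_opted_in: bool) -> None:
    # A data-sharing setting only changes on the back of an explicit,
    # recorded user choice, never silently via a firmware update.
    if user_opted_in:
        settings.share_usage_analytics = True

settings = DeviceSettings()                     # out of the box: nothing shared
enable_analytics(settings, user_opted_in=True)  # requires a deliberate 'yes'
```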
But there are also some privacy challenges which are comparatively novel in this world of IoT, and the IoTAA guidelines call these out. For example, IoT devices in the home may be used by people other than the consumer who purchased them; and indeed those other individuals may have no awareness that the devices exist, let alone are already monitoring them.
(Which reminds me of the time last year when I popped over to a friend’s place and we chatted for a while before he suddenly said ‘Hey Alexa, turn down the music’. Once I realised that there wasn’t actually some other person called Alexa hiding around the corner, it completely freaked me out to think that the device on the kitchen bench that I had taken for some trendy new Scandi pepper grinder was actually listening to everything I said. Luckily all we had been talking about was the cost of kitchen renovations because, well, apparently I am a Sydney-dwelling cliché.)
The guidelines also note that “Many B2C IoT services reach into homes and other domestic, sometimes intimate environments, and enable observations and inferences as to private behaviour that otherwise are not possible”. So, like, for all those times when conversations and activities in the home are a little more spicy than discussions about what type of splashback to choose.
The proposed responses are eminently sensible too, with the guidelines stating that device designers and manufacturers need to design on the basis that the device must be safe for a child to use, and that communications to consumers must be understandable by someone with a “reasonable but below average” level of literacy. The guidelines also criticise attempts by manufacturers to shift data security risks onto consumers, noting that consumers cannot be expected to constantly monitor the use of their devices, or to be knowledgeable or skilled enough to install updates and patches.
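What would it look like for a manufacturer to keep that patching burden off the consumer entirely? Here is a hypothetical sketch: the device checks for a signed update on its own schedule, stages it, and installs it only when nobody is using the device (so the kettle keeps boiling water in the meantime). The URL, hash and helper functions below are all stand-ins, not any real vendor’s API.

```python
import hashlib
import time
from typing import Optional

UPDATE_URL = "https://updates.example-vendor.com/kettle/firmware"  # stand-in
EXPECTED_SHA256 = "0" * 64  # in reality, a value from a signed manifest

def fetch_update(url: str) -> Optional[bytes]:
    """Placeholder for an HTTPS download of the latest firmware image."""
    return None  # pretend no update is available in this sketch

def signature_valid(firmware: bytes) -> bool:
    """Refuse anything that doesn't match the manufacturer's published hash."""
    return hashlib.sha256(firmware).hexdigest() == EXPECTED_SHA256

def is_idle() -> bool:
    """True when the device isn't mid-task (i.e. no one is making tea)."""
    return True

def apply_firmware(firmware: bytes) -> None:
    """Placeholder for writing the new image and rebooting into it."""

def update_loop() -> None:
    staged = None
    while True:
        firmware = fetch_update(UPDATE_URL)
        if firmware and signature_valid(firmware):
            staged = firmware       # stage now, install later
        if staged and is_idle():
            apply_firmware(staged)  # installed while the kettle still works
            staged = None
        time.sleep(6 * 60 * 60)     # check again in six hours
```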
The guidelines also deliberately take a broader view of the data that needs to be considered in a device’s design: beyond information about identifiable individuals (i.e. what privacy law currently protects as ‘personal information’), they also cover data that is ‘private’ because it is “domestic or confidential in nature”, even if no individual is identifiable from it. Bravo, IoTAA, for recognising that privacy harms can arise through what I describe as individuation, as well as identification. And hooray, the guidelines also stress data minimisation. In other words, let the consumer be just a consumer, not a secondary product.
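And to see what data minimisation might mean at the point of collection, here is a toy sketch (the field names are made up): keep only the fields the stated purpose needs, coarsen what you do keep, and let identifiers and ‘private but not personal’ data, like a home Wi-Fi network name, never leave the device.

```python
# Toy example of data minimisation at the point of collection: keep only
# what the stated purpose needs, and coarsen what you do keep.
NEEDED_FOR_PURPOSE = {"temperature_c", "battery_pct"}  # e.g. remote monitoring

def minimise(raw_event: dict) -> dict:
    event = {k: v for k, v in raw_event.items() if k in NEEDED_FOR_PURPOSE}
    # Coarsen the timestamp to the hour: enough for trend charts,
    # not enough to reconstruct someone's daily routine.
    event["hour"] = raw_event["timestamp"][:13]
    return event

raw = {
    "device_id": "kettle-8841",      # identifier: not needed, so not kept
    "wifi_ssid": "SmithFamilyHome",  # 'private' even if no one is named
    "timestamp": "2018-04-03T07:42:19",
    "temperature_c": 98.4,
    "battery_pct": 81,
}
print(minimise(raw))
# {'temperature_c': 98.4, 'battery_pct': 81, 'hour': '2018-04-03T07'}
```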
I believe these new guidelines are a really positive step towards helping designers and manufacturers figure out how to build us the Internet of Safe And Useful Things, instead of an Internet of Stupid Dangerous Invasive Things. Hopefully they will give manufacturers pause for thought before they continue down the road of just sticking a chip in everything.
From smart clothes pegs telling you when your clothes are dry, to a smart umbrella telling you it’s raining, to hair brushes telling you, um, something about how to brush your hair, to smart toilet rolls that can identify you (why??), there is mirth to be found in poking fun at all the dumb things allegedly turned ‘smart’. Because really, does anyone genuinely need their dental floss dispenser to be connected to the rest of the world? I would suggest that our obsession with building connectivity into every little thing is getting out of hand, and that it is time for a sensible re-think.
As technologist Vikram Kumar told a Technology & Privacy forum in New Zealand last year, sometimes, “the best interface to control your lights is a light switch”.
(April 2018 update: If you would like some privacy tools to help you assess the risks posed by a new project, or if you are wondering how the GDPR’s requirement to incorporate Data Protection by Design should be implemented in practice, check out our range of Compliance Kits to see what suits your needs.)