Is it just me, or are things starting to get genuinely creepy around here?
I’m not just talking about the trailer for the new TV show Humans, which looks like a gripping piece of sci-fi drama set in the not-too-distant future. I’m talking about the here and now.
Barbie dolls embedded with voice recognition that enables the doll to hold a conversation with a child, and ‘smart’ TV sets that record what their watchers are saying, and report back to … somewhere. Watches that not only monitor your heart rate, but match that data against your calendar for analysis. Vacuum cleaners that can recognise the objects they bump into. Air-conditioners that know who is in the lounge room, and what their preferred temperature is. Blood pressure monitors that report straight to your hospital.
Sometimes it feels like the word ‘smart’ has become shorthand for ‘surveillance-based business model’. Welcome to the brave new world of the Internet of Things (IoT), in which every little device seems intent on collecting data about you, and likely beaming it somewhere else for analysis and re-use.
There are plenty of ethical concerns raised about IoT devices. Objections to the ‘Hello Barbie’ go beyond the distasteful (but surely not unexpected) concern about direct marketing of other products to children, to include the recording and analysis of the words of children who have little understanding that they are being monitored, let alone how what they say might be used.
There are also sensible questions being asked about the legal and moral responsibilities of the companies collecting data from connected devices.
What should Mattel do if a child’s words, as interpreted by their Barbie doll, suggest they are suffering physical abuse? How should the doll respond if a child asks about death, or God, or where babies come from? What if another family member is overheard making threats?
Can Fitbit data be used in personal injury lawsuits? Can your fridge dob you in to your health insurer?
The security flaws when everything is connected
Then there are the security concerns. One report suggests that the average IoT device has 25 security flaws. If hackers can seize control of a moving car, how hard will it be for the bad guys to take over other, cheaper ‘smart’ devices?
If your child’s doll gets hacked, is it going to turn around and bully your child? Or perhaps it will start asking your child to reveal information about your family, like the password to the home security system, and when you will be away on holidays.
Maybe your vacuum cleaner will start broadcasting video from your home. Or perhaps extortionists will disable all the heating systems in connected homes unless the home owner pays up.
Conditioning us to pervasive surveillance?
And then there are the privacy issues.
The NSW Privacy Commissioner has written about the challenges that the IoT poses for our privacy laws, as devices with unique identifiers push the boundaries of what is regulated as “personal information”. Even devices which might be used by more than one person, such as smart meters within a home, are capable of identifying individuals within a group of users, simply from their patterns of behaviour.
But it’s not just about whether any given device and its manufacturer are regulated by a set of privacy principles. The IoT raises deeper concerns about whether our enthusiasm for ‘smart’ devices and connectivity is conditioning us to a world of pervasive surveillance, and automated decision-making.
Associate Professor Mark Andrejevic and Dr Mark Burdon have written about what they call the ‘sensor society’, in which the always-on interactive device is doubling as a tool for constant, passive data collection. Every connected device is capable of being a ‘sensor’, and monitoring its users. This turns privacy principles such as collection limitation, and limits on secondary use of data, on their head: “the function is the creep”.
But does it have to be this way? Are we going to see a push-back, as businesses realise the benefit of instead choosing privacy?
It’s one thing for businesses to spout platitudes like “we take our customers’ privacy seriously”, and another thing entirely to embrace Privacy by Design. But it can be done. We’ve helped a number of our privacy impact assessment (PIA) clients translate privacy law into design rules for business processes and solution architecture.
Now it looks like some leading businesses are doing the same. Car manufacturer Audi has recently rejected Google’s industry-partnership push to develop internet-assisted driving, with their CEO noting Germans’ reservations about data collection and describing the car as like “a second living room – and that’s private.”
And Apple is keen to distinguish itself as the pro-privacy player in the consumer tech space, designing its latest operating system to keep personalised data only on the device, instead of being shared.
I’ll put it in words that surely Barbie would understand. Maybe, just maybe, privacy is the new black.
Photograph © Shutterstock