Man made software in His own image

July 23, 2015, Stephen Wilson

In 2002, a couple of Japanese visitors to Australia swapped passports with each other before walking through an automatic biometric border control gate being tested at Sydney airport. The facial recognition algorithm falsely matched each of them to the other's passport photo. These gentlemen were in fact part of an international aviation industry study group, and were in the habit of trying to fool biometric systems then being trialled around the world.

When I heard about this successful prank, I quipped that the algorithms were probably written by white people – because we think all Asians look the same. Colleagues thought I was making a typical sick joke, but actually I was half-serious. It did seem to me that the choice of facial features thought to be most distinguishing in a facial recognition model could be culturally biased.

Since that time, border control face recognition has come a long way, and I have not heard of such errors for many years. Until today.

The San Francisco Chronicle of July 21 carries a front-page story about the cloud storage services of Google and Flickr mislabelling some black people as gorillas (see updated story). It's quite an incredible episode. Google has apologised. Yonatan Zunger, its Chief Architect of social, seems mortified judging by his reported tweets, and is investigating.

The newspaper report quotes machine learning experts who suggest programmers with limited experience of diversity may be to blame. That seems plausible to me, although I wonder where exactly the algorithm R&D gets done, and how much control is to be had over the biometric models and their parameters along the path from basic research to application development.

So man has literally made software in his own image.

The public is now being exposed to Self Driving Cars (SDCs), which are heavily reliant on machine vision, object recognition and artificial intelligence. If this sort of software can't tell people from apes in static photos given lots of processing time, how does it perform in real time, with fleeting images, subject to noise, and with much greater complexity? It's easy to imagine any number of real-life scenarios where an autonomous car will have to make a split-second decision between two pretty similar-looking objects appearing unexpectedly in its path.

The general expectation is that SDCs will be tested to exhaustion. And so they should be. But if cultural partiality is affecting the work of programmers, it's possible that testers suffer the same blind spots without knowing it. Maybe the offending photo labelling programs were never verified with black people. So how are the test cases for SDCs being selected? What might happen when an SDC ventures into environments and neighbourhoods where its programmers have never been?
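
To make that blind spot concrete: one simple discipline is to audit a test suite against the attributes it is meant to cover, rather than trusting testers' intuition about what "representative" looks like. Here is a minimal sketch in Python; the attribute names and test cases are hypothetical, invented for illustration rather than drawn from any real biometric or SDC test suite.

    from collections import Counter

    def coverage_report(test_cases, attribute):
        """Tally how often each value of an attribute appears in a
        test set, so absent or under-represented values stand out."""
        counts = Counter(case[attribute] for case in test_cases)
        total = sum(counts.values())
        for value, n in sorted(counts.items()):
            print(f"{attribute}={value}: {n} of {total} cases ({n / total:.0%})")
        return counts

    # Hypothetical test set for an image classifier; each case records
    # the conditions it was meant to exercise.
    test_cases = [
        {"skin_tone": "light", "lighting": "day"},
        {"skin_tone": "light", "lighting": "night"},
        {"skin_tone": "light", "lighting": "day"},
        {"skin_tone": "dark",  "lighting": "day"},
    ]

    coverage_report(test_cases, "skin_tone")  # exposes a 3:1 skew
    coverage_report(test_cases, "lighting")

A report like this cannot prove a test suite is adequate, but it makes the gaps visible: an attribute value that never appears in the test set is exactly the sort of blind spot described above.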

Everybody in image processing and artificial intelligence should be humbled by the racist photo labelling. With the world being eaten by software, we need to reflect really deeply on how such design howlers arise. And frankly, we need to double-check whether we're ready to put computer programs behind the wheel.

Photograph © Shutterstock
