
Channel: Technology and society

Nechama Brodie
Author, fact-checker and academic
Monday, 22 May 2017

The Coded Gaze: Racial Bias In Facial Recognition Software

I recently read a truly excellent piece by Blaise Aguera y Arcas, who leads Google’s Machine Intelligence group, on how machine learning and AI currently enable a more rigorous expression of pre-existing human prejudice, as evidenced by a deeply flawed Chinese study that promised to determine criminal traits through rapid reading of facial features. (Ring any bells? See my previous piq on this channel!) The short version is: supposedly unbiased machines designed to "detect" criminality in a human face are in reality coded (by prejudiced humans) to pick up on existing human biases for or against certain features, most of which have zero link to behaviour. Or to put it another way: the code recreates, with machine-like precision, the way humans judge faces. And, it turns out, we are not very good judges of character.

Fast forward a week or two, and another article pops up on bias in facial recognition software, this time from the intriguing science magazine Undark (there's a gorgeous story behind the name; look it up). The story covers some similar themes, but adds dimensions that are a) actually kind of well known, and/but b) bizarrely still not really talked about: namely, how the technology that captures human faces – everything from old-school camera film to digital camera phones – does some really strange things with darker skin tones and facial features that fall outside of whiteness. Did you know, for example, that Kodak and Polaroid colour film was originally designed only for white skin? Now imagine that same approach, digitised: facial recognition software that tags black faces as primates, or cannot read Asian faces because of their different eye shape. And these same or similar proprietary systems are now used by US law enforcement agencies to match crime photos and carry out surveillance, with equally flawed results.

Until we acknowledge the racism and bias in our code, we won't be able to fix it.
