Nechama Brodie is a South African journalist and researcher. She is the author of six books, including two critically acclaimed urban histories of Johannesburg and Cape Town. She works as the head of training and research at TRI Facts, part of independent fact-checking organisation Africa Check, and is completing a PhD in data methodology and media studies at the University of the Witwatersrand.
I recently read a truly excellent piece by Blaise Aguera y Arcas, leader of Google’s Machine Intelligence group, on how machine learning and AI currently enable a more rigorous expression of pre-existing human prejudice, as evidenced by a deeply flawed Chinese study that promised to determine criminal traits through rapid reading of facial features (ring any bells? See my previous Piq on this channel). The short version: supposedly unbiased machines designed to "detect" criminality in a human face are in reality coded (by prejudiced humans) to pick up on existing human biases for or against certain features, most of which have no link to behaviour whatsoever. Or, to put it another way: the code recreates, with machine-like precision, the way humans judge faces. And, it turns out, we are not very good judges of character.
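To make that mechanism concrete, here is a minimal sketch of my own (not from Aguera y Arcas's piece, and with every feature name and number invented): a classifier trained on human "criminality" judgements ends up scoring faces by the very feature the human labellers were prejudiced against, even though that feature has nothing to do with the true outcome.

```python
# A minimal, hypothetical sketch of how a model trained on human judgements
# reproduces the judges' bias. All features and numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# One facial feature the labellers are (unfairly) prejudiced against,
# plus some unrelated features. None of them cause the true outcome.
prejudiced_feature = rng.normal(size=n)        # e.g. some aspect of face shape
other_features = rng.normal(size=(n, 4))
X = np.column_stack([prejudiced_feature, other_features])

# Ground truth is independent of the face entirely.
true_outcome = rng.binomial(1, 0.1, size=n)

# Human labellers, however, judge "criminality" partly from the face.
bias_strength = 2.0
p_label = 1 / (1 + np.exp(-(bias_strength * prejudiced_feature - 2)))
human_labels = rng.binomial(1, p_label)

# Train on the human labels, as the flawed study effectively did.
model = LogisticRegression().fit(X, human_labels)
scores = model.predict_proba(X)[:, 1]

# The model's scores track the prejudiced feature, not the true outcome.
print("corr(score, prejudiced feature):",
      round(np.corrcoef(scores, prejudiced_feature)[0, 1], 2))
print("corr(score, true outcome):     ",
      round(np.corrcoef(scores, true_outcome)[0, 1], 2))
```

The model looks rigorous and data-driven, but all it has learned is the labellers' prejudice.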
Fast forward a week or two, and another article on bias in facial recognition software pops up, this time from the intriguing science magazine Undark (there's a gorgeous story behind the name, look it up). The story covers some similar themes, but adds dimensions that are a) actually fairly well known, and yet b) bizarrely still not really talked about: namely, that the technology that captures human faces – everything from old-school camera film to digital camera phones – does some really strange things with darker skin tones and facial features that fall outside of whiteness. Did you know, for example, that Kodak and Polaroid colour film was originally designed only for white skin? Now imagine the same approach, digitised: facial recognition software that tags black faces as primates, or cannot read Asian faces because of their different eye shape. And now the same or similar proprietary code is used by US law enforcement agencies to match crime photos or carry out surveillance, with equally flawed results.
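As for what "equally flawed" looks like in practice, here is a small hypothetical audit sketch (my own illustration, not from the Undark piece, with made-up groups, rates and data): a face-matching system can look accurate overall while its false match rate differs sharply between demographic groups, which is exactly the kind of disparity these stories describe.

```python
# A hypothetical per-group audit: measure error rates separately for each
# demographic group instead of reporting one overall accuracy number.
import numpy as np

rng = np.random.default_rng(1)

def false_match_rate(is_match_pred, is_match_true):
    """Share of non-matching face pairs the system wrongly declares a match."""
    non_matches = ~is_match_true
    return (is_match_pred & non_matches).sum() / non_matches.sum()

# Simulated verification results for two made-up demographic groups:
# the system errs on non-matching pairs at a group-dependent rate.
for group, underlying_error in [("group A", 0.01), ("group B", 0.08)]:
    truth = rng.binomial(1, 0.5, size=10_000).astype(bool)
    preds = np.where(truth, True, rng.random(10_000) < underlying_error)
    print(group, "false match rate:", round(false_match_rate(preds, truth), 3))
```

Averaged together, these two groups would look fine on paper; disaggregated, one group is being wrongly "matched" several times more often than the other.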
Until we acknowledge the racism and bias in our code, we won't be able to fix it.