Nechama Brodie is a South African journalist and researcher. She is the author of six books, including two critically acclaimed urban histories of Johannesburg and Cape Town. She works as the head of training and research at TRI Facts, part of independent fact-checking organisation Africa Check, and is completing a PhD in data methodology and media studies at the University of the Witwatersrand.
The intersection of algorithms, data, ethics and humanities is, in many senses, the real coal face of the supposed fourth industrial revolution. The real questions are not whether or when our AI and machine learning programmes will reach some next tier, nor the size of our computer chips or the efficiency of our supercomputers, but what it all means to us, the makers of these things. Hard-coded data patterns are going to shape and unshape us in much the same way that new highways bypassed some towns and put others squarely on the map.
Many commercial algorithms, which harvest data, access it (often data about us), or both, have two distinct areas of opacity. The first is how they operate: the 'black box' problem, where the inputs and outputs are visible but the workings are obscure. The second, equally important, unknown is the potential impact of their (real or hypothetical) outputs. Negative impacts range from the outrageous to the subtle: What might happen if facial recognition software is used to 'determine' whether a person is gay? What happens if black men with tattoos are labelled as potential gang members? What happens if being related to someone convicted of a criminal offence makes it harder for you to get into college or secure a bank loan?
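To make the black-box problem concrete, here is a minimal sketch in Python, with an entirely hypothetical loan-scoring model standing in for a vendor's hidden system. The only thing an auditor can do with a true black box is feed it inputs and compare outputs, so the sketch probes it with pairs of applicants who differ in a single sensitive attribute; the model's internals, the attribute names, and the numbers are all invented for illustration.

```python
# A minimal sketch (hypothetical model and data) of black-box probing:
# we can only observe inputs and outputs, never the model's internals,
# so we audit it by comparing outputs for inputs that differ only in
# one sensitive attribute.

from typing import Callable, Dict

Applicant = Dict[str, float]

def opaque_score(applicant: Applicant) -> float:
    """Stand-in for a vendor model: callable, but its logic is hidden."""
    # (A toy rule plays the role of the unknown internals here.)
    score = 0.5 + 0.3 * applicant["income"] - 0.4 * applicant["relative_convicted"]
    return max(0.0, min(1.0, score))

def paired_probe(model: Callable[[Applicant], float],
                 base: Applicant, attribute: str) -> float:
    """Return the change in output when only `attribute` is flipped."""
    variant = dict(base)
    variant[attribute] = 1.0 - base[attribute]
    return model(variant) - model(base)

if __name__ == "__main__":
    applicant = {"income": 0.6, "relative_convicted": 0.0}
    gap = paired_probe(opaque_score, applicant, "relative_convicted")
    # A large gap reveals that the black box penalises applicants for a
    # relative's conviction, one of the harms described above, without
    # our ever seeing how the model works inside.
    print(f"score change from flipping 'relative_convicted': {gap:+.2f}")
```

Note what the probe does and does not tell us: it can surface *that* an output shifts with a sensitive attribute, which is exactly the second area of opacity (impact), while the first (how the model works) stays sealed.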
When asked about the impacts and potential dangers of these pieces of code, the common response from many of those who work (or worked) on the data and algorithm side is: 'I don't know. I'm just the engineer.' But this article, by an assistant professor at the University of Washington, argues that it is no longer good enough for coders to worry only about whether their code executes perfectly or correctly. It is now all of our jobs to be as transparent as possible: not only to acknowledge what she calls 'data violence', her phrase for the inherent and inadvertent harm done through unthinking or thoughtless applications of data programmes, but to engineer strategies that force us to do better as we progress.