
Channel: Technology and society

Nechama Brodie
Author, fact-checker and academic
Saturday, 05 May 2018

Data Violence And How Bad Engineering Choices Can Damage Society

The intersection of algorithms, data, ethics and humanities is, in many senses, the real coalface of the supposed fourth industrial revolution. What matters is not whether or when our AI or machine learning programmes will achieve some next tier, or the size of our computer chips, or supercomputing efficiencies, but what it all means to us, the makers of these things. Hard-coded data patterns are going to shape and unshape us in much the same way that new highways bypassed some towns and put others squarely on the map.

Many commercial algorithms, which harvest or access data (often about us) or both, have two distinct areas of opacity. The first is how they operate: the 'black box' problem, where the inputs and outputs can be seen but the workings are obscure. The second, equally important, is the potential impact of their (real or hypothetical) outputs. Negative impacts range from the outrageous to the subtle: What might happen if facial recognition software is used to 'determine' whether a person is gay? What happens if black men with tattoos are labelled as potential gang members? What happens if being related to someone convicted of a criminal offence makes it harder for you to get into college or secure a bank loan?
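
To make the first kind of opacity concrete, here is a minimal Python sketch (the data, the features and the 'applicant' framing are all invented for illustration, and it assumes numpy and scikit-learn are installed): a trained classifier whose verdicts anyone can observe and probe, but whose internal decision logic stays hidden from the people it judges.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Train an opaque model on synthetic 'applicant' data (purely illustrative).
X_train = rng.normal(size=(500, 4))                 # four anonymous features
y_train = (X_train[:, 0] + X_train[:, 2] > 0).astype(int)
model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

def black_box(applicant):
    # All a downstream user ever sees: a feature vector in, a verdict out.
    # The hundred decision trees behind the verdict stay out of view.
    return int(model.predict(applicant.reshape(1, -1))[0])

print("verdict:", black_box(rng.normal(size=4)))    # e.g. 0 or 1

# The first opacity: we can log input/output pairs all day without learning
# why the model decides as it does. The second opacity sits outside the code
# entirely: nothing here records what a verdict costs a real person, whether
# that is a refused loan or a gang-member label.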

When asked about the impacts and potential dangers of these pieces of code, the common response from many of those who work (or worked) on the data and algorithm side is: 'I don't know. I'm just the engineer.' But this article, by an assistant professor at the University of Washington, argues that it is no longer good enough for coders to worry only about whether their code executes correctly. It is now everyone's job to be as transparent as possible: not only to acknowledge what she calls 'data violence' (her phrase for the inherent and inadvertent harm done through unthinking applications of data programmes), but to engineer strategies that force us to do better as we progress.
