‘Coded Bias’ and the transformative potential of our watch-lists

by Anna Ireland

Over the last few months our indoor spaces have, for many, become our entire worlds; solitude soothed by lives explored on phones, laptop screens and devices. With colour and sound infusing the darkness of our indoor-landscapes, movies, TV shows and now-streaming plays have become our windows to the world, in all its complicated glory.

As we consume more and more global stories against the backdrop of a worldwide pandemic, and social justice movements gain rapid momentum on social media – in part due to police brutality filmed on smartphones – the power of visual media both to distract and to effect change is evident. And, as police officers brought to justice thanks to a multitude of internet voices have shown, action and technology are linked arm in arm. Our visceral reactions to injustice have been relayed, replayed and reported to effect concrete and structural change.

This symbiotic relationship has emphasised the benefits of engaging with, watching and learning from others on screen. The circulation of films and documentaries as part of the Black Lives Matter movement exemplifies the ways that visual media can feed into wider cultural narratives and our understanding of what the world, in this very moment, requires of us. The transformative potential of our watch-lists has, arguably, never been greater.

However, our relationship with the technology we rely on is complicated. On the one hand, we can use it to educate ourselves and others, or to bring us comfort and joy (after all, only a soothed soul can commit to the task of challenging social injustices). On the other, it can show us things that may be difficult to metabolise or to know how to act upon. On a more insidious level, the platforms we watch and learn from – and those who control them – work via mechanisms often invisible to us; technology facilitates our engagement, but it is no innocent bystander in this dynamic.

How, then, can our relationship with technology be harnessed to effect further change? And how can we better understand our devices so that we are controlling them, instead of them controlling us?

The latter question is a central concern of Take One Action’s 2020 Festival opener, feature documentary Coded Bias. Examining the bias that infiltrates our lives through Artificial Intelligence (AI), and the algorithms that constitute it, the film taps directly into our complicated relationship with technology and shines a light on its invisible ability to manipulate our consciousness and lives. It poses the big question: what bias is embedded in these algorithms, and how can we make them work for us?

Pulling back the curtain on the threats that AI poses to civil rights and democracy, Coded Bias follows MIT researcher, poet, computer scientist and Algorithmic Justice League (AJL) founder Joy Buolamwini as she uncovers racial and gender bias in AI systems sold by large technology companies. To enhance her chances of being recognised by these facial-recognition algorithms, she literally has to wear a white mask.

The documentary traces the creation of algorithms in small, homogeneous groups of (mainly white) men to a coding structure that replicates the prejudice of this group in entire systems affecting the daily lives of millions of people. Algorithms form the backbone of most of our technology, information systems and many decision-making processes. From pay reviews to parole hearings, algorithms have been created by humans to think so that we don’t have to. A huge downside? Empathy. The objectively ‘right’ thing is not always the compassionate thing – a quality computers don’t possess – particularly in cases where individuals’ lives have deviated from the path an algorithm would have predicted. As one middle-school teacher testifies, he had been nominated for countless ‘Teacher of the Year’ awards before being deemed incompetent by the computer system. The case was eventually brought to court, allowing him to keep his job.

Through Buolamwini’s advocacy for everyday people harmed by big tech, we witness the ways that we can fight for technology that fights for us. The goal? A future where social technologies work for the many, not the few. It’s a mantra not too dissimilar to the chant of worldwide social justice movements centring the needs of the everyman above a privileged minority. As Coded Bias highlights, that privileged minority will replicate its own bias within the technology it creates. Tech is, after all, made by humans, and so human prejudices are embedded within its structures and replicated at scale.

How, then, can we harness technology’s positive potential whilst understanding its unseen limitations? How do we make space for bettering our communities amidst this, and carve out our own path of learning, when our technology may be working against us?

As Coded Bias and Buolamwini’s activism remind us, we can take small steps to act against greater forces, bettering our communities by acknowledging the differences inherent within them, and building our technology structures from there. We are reminded of the power of our own potential to learn and act. We are small cogs in a much larger, ever-moving wheel; how, why and by what medium we choose to move is important.

Visual media has a key role to play; witnessing lives beyond our immediate surroundings can be a reminder of the immensity of the world and the small slice our reality constitutes within it. It may encourage us to feel less stagnant as we recognise the ever-changing potential of circumstance. As passengers in the journeys of others, we may come to see a shift in our own situation as possible. That shift could be a thought, a movement, an action. And action can propel us all towards a better future.

Watch Coded Bias at 6:45pm on Wed 16 Sept: http://bit.ly/TOAFF20_CB_e

Join us for the live Q&A with the filmmaker and other special guests at 8:30pm on Wed 16 Sept: https://bit.ly/TOAFF20Coded

Facebook event: http://bit.ly/TOA_CB_fb