Documentary Review: Coded Bias

Renee M. Shelby

March 15, 2021 | Reviews
 

When was the last time you were aware you interacted with an algorithm? The politics of visibility and invisibility profoundly shape our social lives, often in ways we do not notice. “Coded Bias” (2020), a documentary directed by Shalini Kantayya, surfaces the vast yet invisible role of artificial intelligence (AI) in the production of racialized and gendered social control. What does it mean when AI increasingly governs our civil liberties? And what are the social and legal consequences for those targeted by racist AIs? The film explores these questions and how the covert deployment of algorithms has created a new social and economic order shaping power relations: from facial recognition software to hiring algorithms to software informing health insurance decision-making.

The notion that computational technologies have gendered, racialized, and classed politics is not new for STS (see, e.g., Chun, 2009; Benjamin, 2019; Eubanks, 2018; and Vertesi & Ribes’ (2019) edited collection, digitalSTS). However, the film effectively presents and advances many core STS concerns: knowledge production, networked agency, inclusion and exclusion, blackboxing, and examining digital objects and practices in a sociohistorical context. One of the film’s key strengths is how it tells its story about the technopolitics of (in)visibility through the perspectives of critical practitioners and interdisciplinary thinkers. The film includes contributions from Joy Buolamwini, Meredith Broussard, Cathy O’Neil, Zeynep Tufekci, Safiya Noble, Timnit Gebru, and Virginia Eubanks, among others—who present a clear and urgent message about how “algorithmic determinism” shapes our lives in gendered, racialized, and classed ways. As Cathy O’Neil notes in the film, “power is being wielded through data collection, through algorithms, through surveillance.” The unchecked algorithmic control over social life grounds the film’s call for new democratic regulation and governance paradigms that can disrupt the profoundly asymmetrical power relations between those who own the code and those who do not.

[Image: Joy Buolamwini, featured in “Coded Bias”]

The first step in shifting power relations is making visible the role of algorithms and machine learning in shaping economic, cognitive, and legal power. An algorithm uses historical data to make predictions about the future; machine learning, in turn, builds scoring systems that estimate the probability of what you are about to do. Algorithms and machine learning are the invisible gatekeepers that make automated hiring decisions, assess job performance and tenure decisions, and determine how much your insurance should cost, among other high-stakes decisions. Will you pay back a loan? Will you get fired from your job? How much does your toilet paper cost online? These decisions are increasingly algorithmic, and there is little transparency or accountability for those deploying predatory practices (Burrell, 2016; Ananny & Crawford, 2016).
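To make the idea of a “scoring system” concrete, here is a minimal sketch in Python, with entirely hypothetical data and feature names: a model is fit on historical outcomes and then emits a probability score for a new applicant. It illustrates the general technique only, not any specific system discussed in the film.

```python
# A minimal sketch of a probabilistic scoring system (hypothetical data).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [income ($k), debt ($k), years employed]
X_history = np.array([
    [55, 12, 4],
    [32, 30, 1],
    [78,  5, 9],
    [41, 22, 2],
])
y_history = np.array([1, 0, 1, 0])  # 1 = repaid loan, 0 = defaulted

# Fit the model on past outcomes.
model = LogisticRegression().fit(X_history, y_history)

# The "score": the model's predicted probability that a new applicant repays.
applicant = np.array([[45, 18, 3]])
print(f"Predicted repayment probability: {model.predict_proba(applicant)[0, 1]:.2f}")

# Note: the model can only reproduce patterns in its training data -- if the
# historical records encode discrimination, so will the scores it produces.
```

As the closing comment suggests, such a model simply projects the past onto the future, which is precisely why the film's speakers insist that biased historical data yields biased predictions.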

In fact, there is no appeal system for tyrannical technologies built for extraction and social control—a gap the activists in the film hope to close at the federal level. The “free-for-all” approach to data protection in the United States has enabled a stark and troubling landscape. People in America are targeted, profiled, and surveilled in unparalleled ways, and those most likely to suffer “algorithmic harm” are the multiply marginalized.

“Coded Bias,” as the film’s title suggests, focuses on how computational technologies not only co-produce social inequalities but encode and scale them into a new social and economic order. As Safiya Noble notes in the film, “the way we know about algorithmic impact is looking at the outcomes.” The film provides clear examples of how predatory algorithms are “optimized for failure” and of the material impact on those who are “bet against,” such as the 2008 subprime mortgage crisis, driven by Wall Street algorithms, which led to the largest erasure of Black wealth in U.S. history.

In addition to showcasing how AI and data science normalize and proliferate inequality across different domains of life through opaque practices, the film highlights an often hidden digital divide: the most punitive and invasive surveillance tools are deployed on poor and working-class communities first. As Virginia Eubanks argues in the film, “if they work after being tested in an environment where there is low expectation that people’s rights will be respected, then they use them elsewhere.” These insights underscore that we cannot address mechanized discrimination and algorithmic oppression without addressing the social conditions and inequalities in which these technologies are embedded.

[Image: Icemae Downes, featured in “Coded Bias”]

Where there is power, there is resistance to power, and the film touches on politics “from above” and “from below.” It showcases women’s activism and social movements (e.g., the Hong Kong Umbrella Movement) fighting to ensure that surveillance and other algorithmic tools are not abused. One of the critical questions raised in the film is, “How do we get justice in a system where we don’t know how the algorithms are working?” In line with the documentary’s transformative politics, the filmmaker has developed an Activist Toolkit that includes a glossary of terms and guidance on starting community conversations about data rights. Contestations over who participates in and benefits from AI and other digital technologies are not merely about what policies and practices are needed to prevent abuse but about whether certain technologies should exist at all. Last summer, IBM announced it was getting out of facial recognition following the protests against police violence, and Amazon and Microsoft announced temporary pauses on police use of their facial recognition technologies. Whether these Big Tech companies will change course remains to be seen; accountability requires constant vigilance. Yet refusing data harms is also a form of justice, and the film speaks to many concerns raised in the collectively written Data Manifest-No, a declaration of refusal of harmful data regimes and a commitment to new data futures.

In all, “Coded Bias” provides a sharp examination of and reflection on how computational technologies work against marginalized people. The film is an accessible, engaging, and insightful resource for teaching about the relationships between technology, inequality, and justice. More information about the film can be found here. Libraries, colleges, and universities can purchase the film for their courses here, and organizations may book a screening of their own here.

“Coded Bias” will premiere on PBS on March 22nd at 10 pm (all U.S. time zones). Aalto University, Helsinki, offers a free week-long virtual screening from March 20-27 and a conversation on March 25th at 6:30 pm EET between director Shalini Kantayya and Prof. Nitin Sawhney, along with AI ethics experts and invited members of the film cast (details and registration here). 

Note about the filmmaker: “Coded Bias” is directed by Shalini Kantayya. Kantayya is a TED Fellow, a William J. Fulbright Scholar, and an Associate of the UC Berkeley Graduate School of Journalism. She directed for the National Geographic television series Breakthrough, and her debut film, Catching the Sun, premiered at the LA Film Festival, was named a New York Times Critic’s Pick, and was nominated for the Environmental Media Association Award for Best Documentary.


Renee M. Shelby is a postdoctoral researcher with the Sexualities Project at Northwestern University and a Visiting Fellow with the Justice and Technoscience Lab (JusTech) at the Australian National University. She holds a PhD in Sociology of Science & Technology from the Georgia Institute of Technology.


