Intersectional Feminism and Tech Ethics: Staying out of the Intellectual Chess Game

Madelaine Ley

November 21, 2022 | Reflections

Recently, Abeba Birhane, a cognitive scientist pushing forward relational and decolonial approaches to AI, commented on Twitter: “a gentle reminder that what you consider ‘an interesting AI ethics take’ on a given use-case can be a matter of life and death for people (often at the margins of society) subjected to AI systems.”

Not always, but there are times when I’m the target audience of this tweet. On these occasions, I know I’m swept up in what I call the “intellectual chess game.” It’s when your research path is organized into moves responding to those of others. You search for gaps and holes, but often only within academia. This is an isolated terrain fundamentally shaped by the limits of modernity and colonial thinking. Quantifiable and replicable knowledge is prioritized, while intuitive and contextual knowing is often deemed unserious. Being good at navigating this game can be personally thrilling and can help you meet success metrics like external funding and publications. It can mean you get excited about a certain research gap on the injustices of AI, instead of feeling frustrated at the injustices that exist in the first place.

I have fallen prey to the chess game within my own work. In the unusual position of an ethicist placed in a robotics lab, I often see untapped research areas: integrating ethical principles into robotic software; a phenomenological account of spatial awareness in robots; an ethnography of my position alongside the engineers. These are all interesting subjects, but the drive behind picking research lines can be one of extraction. It may not be rooted in an urgency for the project, but rather in an intellectual interest or a funding possibility.

In such situations, I have found that intersectional feminist approaches to technology ethics keep me from floating too far into abstraction. Intersectional feminism emerged in the 1980s in response to the mainstream feminist movement, as thinkers and activists like bell hooks, Kimberlé Crenshaw, and Patricia Hill Collins pointed out what should have been obvious: the path to liberation is not the same for all women. In fact, the emancipation of white, middle- and upper-class women is often made possible by the labor of black and brown women. The matrix of domination, a concept developed by Collins, is unique for each person according to a given context, including their class, race, gender, sexual orientation, and abilities.

Technology, like mainstream feminism, is not emancipatory for everyone. This reality is made clear by the work of scholars like Safiya Umoja Noble, who describes how typing “why are black girls so” into a search engine leads to autosuggestions such as “angry, loud and mean.” Adrienne Williams, Milagros Miceli, and Timnit Gebru discuss the exploitative labor practices required to develop and sustain AI systems. Similarly, Alexander Monea articulates how the infrastructure of the internet tags queer content as adult content, thereby systematically sexualizing and marginalizing LGBTQ people in our increasingly digitized world.

These scholarly resources are important to me because they illuminate oppressions I don’t personally experience. In my closer environment and social relations, I encounter other examples of technological discrimination and injustice. I have sat with a friend grieving the loss of their place on a social housing list because they misunderstood the online application form. I waited six months alongside a family member struggling with addiction for an algorithmically generated appointment notification. I was enraged to hear of another family member driving 12-14 hours a day to provide home delivery services, with no pay raise to compensate for the growing workload. At the root of these instances, among other societal factors, are technologies and cultures that allow for automated social care and the prioritization of profit over well-being in the retail sector. This last example motivates my own work, where I show that industrial and retail robots don’t necessarily alleviate burdens of labor, as purported by many technology companies, but in fact often contribute to the dehumanization of workers.

These realities emerge from dynamic systems of technology, culture, and power. Changing these systems will require intersectional research that shifts technology design, integration, and use. While the focus of this scholarship is often on socio-economic injustices, I believe it should be equally motivated by a deeply rooted reverence for the innate value of each person. This might begin with the practice of situating knowledge, à la Donna Haraway, which gives a detailed account of one’s own perspectives and the context of study. But what I suggest goes one step further by involving the heart. Perhaps, then, poetry is better suited to explain. In the words of Robin Morgan: “Hate generalizes, love specifies.” Scholars are rarely taught to include love and reverence in their work, likely because these aspects of human experience elude the measurements of science. Nonetheless, I find them necessary to stay tethered to humanity in my work and out of the chess game.

---

Join TU Delft’s Intersectional Philosophy of Technology reading group to learn more. We gather once every two months to do some collaborative thinking on a particular text. Find up-to-date information on our Twitter page: @femphildelft.

On December 8th, 15:00-16:30, we will host a hybrid event with the TU Delft TPM AI Lab where Abeba Birhane will speak on relational and decolonial approaches to AI.


Madelaine Ley is a PhD Candidate in Technology, Policy and Management at TU Delft. Trained in philosophy and science and technology studies, Ley’s primary research takes a feminist and care ethics approach to discourse on the future of work, focusing on workers’ embodiment, emotions, and sociality.
