Designing AI with Justice

The following is a lightly edited transcript of remarks delivered at the Co-Opting AI: Justice event, held on May 13, 2019.

I will discuss three concepts in this talk: first, the idea of design justice; second, how people are already resisting oppressive AI; and third, the ten principles of design justice.

Immediately after this talk, I have to head directly to the airport. At the airport, like you, I have to go through the millimeter wave scanning machine—that’s the one that spins around you while you put your hands up. Now, a lot of cisgender people don’t necessarily know what comes next, but most trans and gender-nonconforming people do.

When you approach the millimeter wave scanner, the TSA agent looks at you and decides whether they think you’re a boy or a girl. If they think that you’re a boy, they press the blue “boy” button; if they think you’re a girl, they press the pink “girl” button. Then the millimeter resolution scan of your body is compared to a statistical model of an average male or female body. Anything on your scan that diverges from that model is flagged and highlighted in bright yellow with a little red outline.
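In rough pseudocode terms, the logic of that system amounts to something like the following sketch. (The function names, data representation, and threshold here are purely illustrative; the actual TSA software is proprietary.)

```python
# Hypothetical sketch of the scanner's anomaly-flagging logic.
# All names and values are illustrative; the real system is proprietary.

def flag_anomalies(scan, male_model, female_model, agent_choice, threshold=0.5):
    """Compare a body scan against one of two binary reference models
    and flag every region that diverges from the chosen model."""
    reference = male_model if agent_choice == "male" else female_model
    flagged = []
    for region, measurement in scan.items():
        # Any divergence from the "average" body is marked as an anomaly.
        if abs(measurement - reference[region]) > threshold:
            flagged.append(region)
    return flagged

# A traveler whose body diverges from both binary models is flagged
# no matter which button the agent presses.
scan = {"chest": 2.0, "hips": 1.0}
male_model = {"chest": 1.0, "hips": 1.0}
female_model = {"chest": 2.0, "hips": 2.0}
print(flag_anomalies(scan, male_model, female_model, "male"))    # flags "chest"
print(flag_anomalies(scan, male_model, female_model, "female"))  # flags "hips"
```

The point the sketch makes concrete: the "anomaly" is not a property of the body being scanned, but of the body's distance from whichever binary model the agent happens to select.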

“‘Anomalies’ highlighted in millimeter wave scanner interface” (2016). From “Traveling while Trans: The False Promise of Better Treatment,” by Dr. Cary Gabriel Costello

So, for people like me—for gender-nonconforming people and for many trans people—you can’t win. If I get selected as female, I’m going to have part of my body highlighted; if I get selected as male, I’m going to have another part of my body highlighted—all based on the way that my body diverges from a binary model of what human bodies are supposed to look like.

Now, this means that cisnormativity—the idea that everybody is born with a gender identity and presentation that conforms to the sex that they were assigned at birth—is reproduced in the sociotechnical system we call airport security. Such a system includes all of the actions detailed in the security protocol, the model that the TSA officer is following, the machine itself, the statistical model (which is a binary based on the average dimensions of some group of male or female bodies), and so on and so forth.

For me, my white skin, my educational privilege, and my position in a powerful institution are all probably going to protect me from the worst harms of being classified as an anomaly. I probably won’t be pulled out of security and hooded and whisked off to Guantánamo Bay or another of the secret prisons in the US’s global so-called war on terror. I probably won’t be pulled out for extended questioning, as many other people—who might have been mistakenly placed on the terrorist watchlist, or might have a Muslim last name or brown skin, or might inhabit any number of possible identities unlike mine—will be. Because I have certain kinds of privileges, I probably won’t even miss my flight.

Yet I use this example to talk about design justice, because it’s a great way to think about AI. (When we say AI, we’re usually just talking about machine learning, which is a technique for taking existing data sets and models that we have of how the world works and using them to predict things about the future.) Airport security is a great way to think about how those systems can’t just be fixed by having more inclusive data sets; such a solution, it becomes clear, actually limits the scope of our vision of the possible worlds that we might want to build. Part of why this approach is wrong is because it is a quick-fix way of thinking about the harms of these systems, and one that ignores history and deep structures.

In fact, misgendering and cisnormativity aren’t things that were just created with the rise of AI systems. Misgendering and cisnormativity, in the Americas and much of the rest of the world, were imposed over hundreds of years of violent settler colonialism. An image of Spanish conquistador Vasco Núñez de Balboa depicts him having his dogs devour the bodies of indigenous Quarequa third-gender people, whom he saw as gender-nonconforming people (men in women’s clothing). Cisnormativity, brutally ingrained over centuries, is not going to be fixed by a more inclusive data set.

If we try to imagine worlds that we want to build, certainly design has to be part of those worlds.

So, what can we do? I’m part of a community of people working out how the design of sociotechnical systems can be one means of dismantling or transforming systems of oppression. We are designers and developers who think of ourselves as part of the Design Justice Network.

Design justice is a growing community of practice focusing on the equitable distribution of design benefits and burdens: what we might call the fairness aspect of the equation. But we also focus on supporting meaningful community participation in design decisions, as well as recognizing community-based, indigenous, and diasporic design traditions, knowledges, and practices. (This is what Arturo Escobar talks about in his new book, Designs for the Pluriverse: thinking about how we might design in ways that respect many different ways of knowing, understanding, being, and making the world.)

In addition to a community of practice, design justice is also a framework for analysis. As a framework, it allows us to question how the design of sociotechnical systems influences the distribution of benefits and burdens among various groups of people. Specifically, design justice talks about how design reproduces or challenges what black feminist sociologist Patricia Hill Collins calls the “matrix of domination”: the intersection of white supremacy, hetero-patriarchy, capitalism, ableism, settler colonialism, and other forms of structural and historical inequality.

So, design justice goes beyond fairness. It entails thinking about the matrix of domination—about intersecting systems of oppression—and what it means to design sociotechnical systems that can transform or overturn these systems, rather than constantly reproducing them in technology, in design, and in machine learning. Because if we try to imagine a world that we want to build—one into which many worlds fit, one where we’re included, and one which is also ecologically sustainable—certainly design has to be part of that world.


But even though design could be one key to our collective liberation, the design of AI, or machine learning, is still deeply inequitable. It’s inequitable in terms of: who gets to build it; who the paid AI workers are; who the imagined users are; the goals of the systems; the sites in which we’re building these things; the power relations that these systems support and strengthen; the pedagogy that we’re using to teach the people who are learning how to build these systems in computer science departments around the country and around the world.

The benefits of design typically flow to the more powerful people within the matrix of domination, while the harms fall on those who are less powerful. Design justice seeks to expose these flows and address them.

And design justice is also about the design process, the actual building of things—because, as a designer, I am not ready to give up on machine learning. There are many times when my colleagues and I want to just say: “No, we don’t want to build these things.” But I’m not willing to say that we never want to build AI systems for any reason.

So, if we’re going to be building, for example, an x-ray system that’s powered by AI, how are we going to build it? Design justice says that the design process needs to be rethought. Look to the accompanying chart.

Image courtesy of Sasha Costanza-Chock

The pink dots are experts with technical knowledge of how to build a particular system, those whom current design processes center on. The blue dots around the edge are the people whose lives will be impacted by the system.

Image courtesy of Sasha Costanza-Chock

But design justice says that the process should look more like this—a process in which there are still experts with technical knowledge involved in the system production, but which is centered on the people whose lives are going to be most impacted. The experts with technical knowledge take part in supporting and amplifying and building upon the needs of the most impacted communities, which contain people who have other kinds of expertise, who have lived experience—like my lived experience of how these millimeter wave systems fail.

The Design Justice Network was born in 2015, at the Allied Media Conference in Detroit. Designers, organizers, technologists, and activists gathered and explored how to do design justice work, sharing knowledge about what their design practices look like. Many have subsequently been influenced by design justice principles. For example, see the Consentful Tech Project, which Una Lee and the And Also Too design shop are working on to think about how we might apply ideas about sexual consent to the process of designing technologies. Consent is freely given, reversible, informed, enthusiastic, and specific (FRIES); most of our AI systems are none of those things.

Image courtesy of Sasha Costanza-Chock

The #MoreThanCode project was influenced by these principles as well. For this project, we interviewed over 100 technology practitioners who are trying to do public-interest technology or community technology work. We asked what works well and what doesn’t. The top takeaway from that work was “Nothing About Us Without Us,” which is, of course, the slogan from the disability justice movement. It means: those who are most affected need to be centered in the design process. And not just in a consultative way, but in a way that allows them governing power or that ensures some accountability.

We need to rethink design sites, too. That is the idea behind the DiscoTechs (Discovering Technology Fairs), a model developed by the Detroit Digital Justice Coalition that is sort of like a hackathon, but much more inclusive and community friendly—there’s food, there’s childcare. The idea is to create open spaces in which to think about how we design the futures that we want to have.

I teach a class at MIT that is called the Co-Design Studio, where I’m trying to figure out how to teach and learn together with students to do this type of work, to do design justice in practice. We codesigned graphics for the Electronic Frontier Foundation Surveillance Self-Defense tool kit. Another student project works with the Chinese Progressive Association to think about how to produce publicity materials about the importance of data disaggregation in the upcoming census that could be circulated in WeChat groups.

Resistance is fertile. Even though I started this talk with a pretty intense story and spoke about hundreds of years of oppression, we need to give our communities more credit for understanding what’s happening, and for organizing and mobilizing against it. Because wherever there’s oppression, there are communities resisting and people coming up with strategies to survive and to change oppressive systems.

“Atlantic Plaza Towers residents protest doorway facial recognition,” courtesy of Brooklyn Legal Services

At the moment, for example, over 180 residents of Atlantic Plaza Towers in Brooklyn have risen up to say that the building owners are not going to be able to install a facial recognition entry system that would analyze people entering the building. We have to show how these systems are unfair—in fact, for new systems like this, we don’t even know how they are unfair yet. But that analysis is part of a larger strategy of saying no: saying, no, we don’t want them at all.

There’s a race right now to see which city will be the next to ban facial recognition technology use by law enforcement. The City of Somerville has banned law enforcement’s use of these technologies. San Francisco passed a ban, as did Oakland, among others. We should figure out a way to give a prize to cities that do this right and encourage more action at the city level. In the age of Trump, we have to look for municipal actions.

We’re living through an incredible moment in the tech world right now: you can find out a lot more under the hashtag #techwontbuildit. Under this umbrella, you can find Google workers successfully organizing to force Google not to take the Project Maven contract, a drone AI contract which was actually a test run in order to secure JEDI, the Department of Defense’s massive cloud computing contract. We have the #NoTechforICE campaign: Mijente, a national organizing hub for Latinx and Chicanx populations, has held demonstrations outside Palantir offices in Manhattan and DC to protest Palantir’s collaboration with ICE to round up, detain, and deport undocumented people.

A new study in the UK of more than 1000 tech workers found that about 60 percent of those working on AI systems said they’ve worked on products they felt might harm society. This is terrible news, but 27 percent of these respondents had quit their jobs as a result. And that, to me, is encouraging; it means that #techwontbuildit is not just a flash in the pan and is not just marginal.

Other initiatives also give me hope—like platform cooperativism, which tries to build platforms whose business model isn’t just worker exploitation and data exploitation and extraction. Here in New York, for example, you’ve got Up & Go, which is a platform co-op that links employers up to cleaners from worker-owned cleaning companies. The co-op is also deciding how data about workers and clients will flow in ways that are less oppressive and extractive.

I will end by laying out the Design Justice Network principles, which those of us in the Design Justice Network are trying to activate:


Principle one: We use design to sustain, heal and empower our communities, as well as to seek liberation from exploitative and oppressive systems.


Principle two: We center the voices of those who are directly impacted by the outcome of the design process.


Principle three: We prioritize design’s impact on the community over the intentions of the designer.


Principle four: We view change as emergent from an accountable, accessible, and collaborative process, rather than as a point at the end of a process.


Principle five: We see the role of the designer as a facilitator rather than an expert.


Principle six: We believe that everyone is an expert based on their own lived experience, and that we all have unique and brilliant contributions to bring to a design process.


Principle seven: We share design knowledge and tools with our communities.


Principle eight: We work towards sustainable, community-led and -controlled outcomes.


Principle nine: We work towards non-exploitative solutions that reconnect us to the earth and each other.


Principle ten: Before seeking new design solutions, we look for what is already working at the community level. We honor and uplift traditional, indigenous, and local knowledge and practices.


You can sign on to these principles at the Design Justice Network website. Thank you.


This article was commissioned by Mona Sloane.

Featured image: “Banksy, Marble Arch, London” (2018). Photograph by Niv Singer / Unsplash