“The past dwells in our algorithms,” says Joy Buolamwini, a researcher at the MIT Media Lab and protagonist of Coded Bias, Shalini Kantayya’s new documentary on the widespread use of artificial intelligence in our everyday lives and the racial and gender biases baked into its algorithms.
The film kicks off with Buolamwini’s accidental discovery of bias in commercially available facial recognition tools, which couldn’t recognize her dark-skinned face until she put on a white mask. From there, Kantayya builds a far-ranging examination of automated decision-making tools already deployed in our institutions and illuminates the threat to civil liberties across the globe, from an apartment complex in Brooklyn to the streets of London to government-sponsored surveillance in China.
An official selection at the Sundance Film Festival earlier this year, Coded Bias has been compared to An Inconvenient Truth in breaking down the diffuse issues of surveillance, data mining, and machine learning. Variety calls Coded Bias both a “wake-up call” and “a call to action,” and The Hollywood Reporter hails it as “a chilling plunge into Orwellian reality.” The film opens nationally in November.
Kantayya spoke with me via Zoom about the making of the film, why algorithmic justice is an urgent civil rights issue, and how AI can become more just.
This interview has been condensed and edited for clarity.
Reena Shah: I understand that you love tech and are also drawn to how tech shapes inequality. How did these two impulses influence Coded Bias?
Shalini Kantayya: I’m a science fiction fanatic. I love imagining the future of tech, and tech is part of my practice as an artist and my fascination as a human being. All of my work has to do with how disruptive technologies impact equality and whether they make the world more or less fair for marginalized communities, including people of color and women. My first film [A Drop of Life] is based on a true story about a woman who puts a prepaid system on a village water pump in India, so you can only get water with a prepaid card. And Catching the Sun is about how small-scale residential solar can uplift the working class, and about the economic and innovation opportunities in building a clean-energy economy. With Coded Bias, I discovered the work of two of the women in my film, Joy Buolamwini and Cathy O’Neil, the author of Weapons of Math Destruction, and fell down the rabbit hole into the dark underbelly of Big Tech.
RS: The film begins with Joy Buolamwini’s discovery about facial recognition technology and then quickly, and startlingly, expands into a broad range of surveillance technology and automated systems in our daily lives. Was that your intent?
SK: I definitely did not know that narrative before I started. What I discovered was Joy Buolamwini, who was trying to make an art project work by getting a Snapchat filter to recognize her face. She stumbled upon this crazy thing: that facial recognition, already deployed widely and largely in secret, without an elected official approving it, had serious racial and gender biases. Through Joy’s story, I follow how automated systems have become invisible gatekeepers of opportunity, deciding who gets health care, who gets hired, who gets into universities, who gets parole, who shows up on your feed.
Facial recognition is in some ways the easiest algorithm to understand, because it’s so invasive and because of surveillance cameras. But I think, with the research of so many brilliant and badass women in my film, the story is able to get at the more invisible, opaque algorithms.
RS: What were some of the challenges in making the issue of bias in tech come alive for the general public?
SK: Well, I couldn’t talk to people for two years. I’d go to a party and someone would ask, what are you doing, and I’d say, “I’m making a film about how robots are racist and sexist.” It was really hard to explain. There are seven Ph.D.s in the film. I was talking to so many smart women and trying to figure out how to weave the story together.
What I learned in the making of Coded Bias was that data rights really intersect with human rights and civil rights. What I’m trying to do in the film is bridge a gap in scientific understanding. We see with COVID-19 that understanding, translating, and acting on science are really important to civic engagement. I feel that the geniuses in my film, like Joy, do a really good job of showing why the science matters and to whom it matters most.
Computer scientist and digital activist Joy Buolamwini. (Courtesy Coded Bias)
RS: What was the moment, in making this film, when you felt most concerned, and perhaps most overwhelmed, as a citizen?
SK: First, I’m not just saying this, but I do feel a tremendous amount of hope. I watched the research of everyone from Joy Buolamwini to Ravi Naik [human rights lawyer] to Deborah Raji [tech fellow at the AI Now Institute at New York University] change the world. In June, IBM said that it would get out of the game and stop researching, deploying, and selling facial recognition. Microsoft said it would stop selling to police, and Amazon said it would take a one-year pause. Of all the outcomes that I imagined at the beginning of the journey, I never expected this kind of sea change. Perhaps it’s only a gesture, but I do feel that we have a moon-shot moment to shape these technologies of the future.
Right now, all of the power is on one side, and that’s not healthy for democracy. For instance, if you go to a protest and police take a picture, load it into a system, and pull up your whole social media profile, do you have freedom of assembly? If you post something criticizing Facebook, and Facebook’s algorithm places it at the bottom of the feed, do you have freedom of speech? There are so many things that we haven’t thought about because we lack a basic understanding of how these systems work. What’s most alarming for me is how widely these systems are already being deployed. We’re outsourcing our decision-making to machines with biases that can hurt people and have real consequences for people’s lives. If the public knows more, we can put pressure on Big Tech to do the right thing. The technology is still wet, like cement that hasn’t set, and we can still imprint our values on these tools. But we need balance and we need laws.
RS: There are several organizations, including the Algorithmic Justice League, pushing for a regulatory body, a kind of FDA for tech.
SK: I saw essentially three approaches [by governments] to data. There’s China, which plays like a Black Mirror episode inside the film, where surveillance is literally built into grocery store purchases and vending machines. There are the Europeans, who have a framework around data rights being human rights.
Then we have the U.S., which is the Wild West. Here, tech is not regulated like other sectors of society. I’m hopeful that we’ll start to question, as Cathy O’Neil says, “the blind faith that we have in big data,” and that we can enact a balance of power and advocate as citizens to protect our civil rights.
In the making of my film, I saw a 14-year-old Black British boy in school uniform wrongly identified by the police. There’s a particular danger when these racial biases are weaponized with the power of the state. When Joy spoke to Congress in 2019, Jim Jordan, Republican of Ohio, said, wait a minute, 117 million Americans are in a police database and no one in elected office made a decision on that? That’s a cause for alarm, and he looked as terrified as the Democrats.
I think the current sea change owes a debt of gratitude to the people in the streets. That movement has changed how people view systems of oppression and systemic racism and sexism. In June, we got the first-ever legislation on the table that would ban federal use of facial recognition. Hopefully, it’s the beginning of a conversation. I’m hoping the film offers a way to talk about these issues and connect them to the things that people care about. As we peel back the curtain on the Wizard of Oz, on Big Tech, we can see it for what it is: something made by flawed people coding their own biases into these algorithms. Knowing this can change the way we think about technology.
RS: In the film, a young woman in China discusses the benefits of surveillance and the social order it helps her navigate. It’s a scary moment. How does this compare to the U.S.?
SK: She loves Big Brother! And we all do a little bit. People [in the U.S.] talk about that idea like it’s far away. But how many times have you looked at someone’s Twitter or Instagram followers or Facebook likes and made a judgment about them? We are all being scored all the time in different ways. We are engaging in what Cathy [O’Neil] calls algorithmic obedience training. We’re all being trained to be obedient based on our score.
The reason for the scene in China is to illustrate that we have no protection against that happening in the United States. We don’t have laws that would prevent large-scale, invasive surveillance, either by a state or corporation.
Digital surveillance is omnipresent in China. (Courtesy Coded Bias)
RS: How does our love affair with efficiency and convenience play a role? Are human rights and civil rights being trampled in the name of ease and automation?
SK: For me, that’s a Gattaca-esque future. Really, we are more than our past data. We become the first people to go to college. We become larger than the sum of our parts. But the race to efficiency denies our humanity.
I was especially happy to have Coded Bias screened at the New York Human Rights Watch Film Festival in the context of human rights. The current protests are part of a fight for the inherent value of every human being, including Black lives, and for inalienable rights. We’re learning that we only have as many rights as the most vulnerable among us. As COVID taught us, we are only as healthy as the most vulnerable among us. We’re being asked to build a more humane culture, but the race to efficiency that is making a few people very rich is undermining that. Efficiency culture and greed are working against building a human rights culture.
RS: The film brings up great questions about the purpose of technology and AI. Are these technologies helping us?
SK: Well, this is the only way we can talk right now, the only way we can be together. So I’m grateful for it. But I think tech can work in ways that serve us better. I don’t think that efficiency should always be the goal. And the power of tech needs to be kept in check: it might have to pay some taxes, obey laws, respect small businesses and human rights, and let people unionize.
We need tech to uphold democratic values as more of our lives move into a virtual public square. We can’t opt out of these systems; increasingly, this is our way of communicating. There should be some [clarity] about what we’re consenting to in terms of how our data is used. We should all be suspicious when Google and Apple get together and say, we’re going to solve COVID by tracking everyone, and we just need to know where you’ve been and who you’ve been with. We should all be suspicious when Google and Apple have the kind of data that makes the East German Stasi look like it had a light touch.
RS: There’s a moment in the film when Cathy O’Neil talks about the “black box” of these algorithms and how bias gets ingrained into these technologies through a feedback loop. Are the people creating these technologies unaware of these biases?
SK: I think a lot of times the biases are unintentional, and that data can be passed into the wrong hands and used in a predatory way. Take the example of Amazon: they were trying to build a hiring system that was fairer, and unknowingly the algorithm discriminated against anyone who had a women’s sport or women’s college on their resume, because of who had been successful in the past. So unwittingly, the data was predicting the future based on the past.
I think the tech industry knows this issue exists and wants to fix it. But Silicon Valley wasn’t the main audience for my film because I don’t think it should be up to them. As long as it’s up to big tech companies, we’re doing something wrong.
RS: It seems that AI needs humans.
SK: AI needs diverse humans. AI needs women. AI needs to be inclusive. Fourteen percent of AI researchers are women; I couldn’t even find stats on people of color. That’s really inexcusable. We have to find out what’s going on with that pipeline. Inclusion makes technologies more innovative, and it’s critical when technologies are being deployed to everyone.
Technology is a reflection of ourselves. It reflects our human bias. The process of checking for bias is something we have to do continually. It’s like the conversation about inclusion. It has to happen all the time, and it makes us healthier and more competitive and humane.