WMC News & Features

‘Coded Bias’ Shows How Women Are Leading a Civil Rights Movement in Technology

Algorithmic Justice League founder Joy Buolamwini in Coded Bias (Photo courtesy of 7th Empire Media)

Some people might be surprised to find out that in the male-dominated world of computer technology, many women are leading the grassroots groups that are fighting against bigotry and the abuse of power. The compelling documentary Coded Bias, directed by Shalini Kantayya and released in virtual cinemas, shines a spotlight on this high-tech civil rights movement. But this movement isn’t just about fighting against gender discrimination. It’s also about preventing violations of civil rights for everyone, and creating more awareness of what people can do to protect their rights.

“I didn’t set out to make a film that was predominantly women, but my research kept leading me back to all these brilliant, badass women,” Kantayya tells Women’s Media Center. “And what I realized was there was this untold story … this unheard genius. What I learned was that the people leading the fight for ethics in A.I. [artificial intelligence] are women and people of color, which is why they’re featured so prominently [in Coded Bias].”

Technology activists who are featured in the film include Joy Buolamwini, founder of the nonprofit Algorithmic Justice League (AJL); Silkie Carlo, director of the civil liberties group Big Brother Watch UK; Weapons of Math Destruction author Cathy O’Neil; Baroness Jenny Jones, a member of the U.K. Parliament’s House of Lords; Automating Inequality author Virginia Eubanks; Partnership on AI research fellow Deborah Raji; Algorithms of Oppression author Safiya Umoja Noble; Artificial Unintelligence author Meredith Broussard; Twitter and Tear Gas author Zeynep Tufekci; and futurist Amy Webb, author of The Big Nine.

Today it would be difficult to find anyone in a developed country who hasn’t been recorded on surveillance video, used a cell phone, or gone online. All of these activities generate data that various entities can collect and then use to violate privacy or discriminate illegally. Coded Bias focuses on facial recognition technology and the abuse of power that can follow when it becomes a basis for racial and gender discrimination.

Imagine being denied access to a computer program because it rejects your skin color. That’s what happened to Buolamwini when she was a Ph.D. candidate at the MIT Media Lab. She created the Aspire Mirror, a device that lets people superimpose images on their faces, and installed facial recognition software that was supposed to track her face so the mirror would work.

Buolamwini, who is African American, was shocked to find out that the computer software couldn’t read her face and therefore wouldn’t let her access the Aspire Mirror. As an experiment, she put on a white mask when using the software, and her access was immediately granted. In Coded Bias, she demonstrates how the software discriminates according to skin color, and it’s one of the most impactful scenes in the documentary.

This incident led Buolamwini to investigate why the software failed to detect her face. She found that it had been built and trained largely on images of white men and other people with light skin tones. In the documentary, Buolamwini concludes that because white men make up the majority of computer programmers and top technology executives, a bias takes hold that incorrectly assumes most people using the technology will also be white men.

Buolamwini went on to found AJL to address discrimination in computer technology. In 2019, she testified about these issues at a U.S. House of Representatives hearing on facial recognition technology. (Coded Bias includes footage from the event.) That high-profile congressional hearing, along with the work of AJL and other activist groups, has raised awareness and spurred pushes for legislation and corporate accountability to prevent discrimination in technology. In June, a Democrat-sponsored bill was introduced in the House to limit law enforcement’s use of facial recognition technology.

Coded Bias reveals how AJL and other civil rights groups help sound this alarm: The fact that something (such as an algorithm) is computerized doesn’t mean it’s neutral and can’t discriminate. Anything created by people can include the creators’ prejudices in its structure and therefore in the end results. The biases baked into this technology can result in racial and gender discrimination that can affect people’s employment, housing, and access to resources such as health care, financing, insurance, legal services, and education.

The documentary includes interviews with New York City tenant activists Icemae Downes and Tranae Moran, who helped lead successful protests against plans by their landlord, Nelson Management Group, to replace tenants’ keys with facial recognition technology at Atlantic Plaza Towers, an apartment complex in Brooklyn. Most Atlantic Plaza Towers residents are Black, yet, as the documentary points out, this facial recognition technology was not planned for Nelson Management Group buildings where most residents are white.

Coded Bias also mentions Amazon being exposed for using an A.I.-based recruiting tool that discriminated against female job applicants. Amazon says that in 2017 it disbanded the team of developers who made this software and claims to no longer use the tool.

Coded Bias shows the work of Big Brother Watch UK, a United Kingdom civil liberties group that fights discriminatory surveillance practices that violate privacy and other civil rights. The documentary includes London footage of Carlo and other Big Brother Watch UK members confronting plainclothes police officers who detain, question, and sometimes fingerprint people on the street based on data collected from a nearby unmarked video-surveillance van. The van is equipped with facial recognition technology, which police use during street surveillance to match the faces of passersby against a database of suspects. Recording people on a public street without their permission is legal; the privacy concern is how those recordings are used.

One of the officers admits on camera that the facial recognition technology is highly error-prone and often produces mistaken identifications, matching innocent people to faces in the criminal database. That was the case with an innocent Black teenage boy whose street interrogation and fingerprinting were caught on video and shown in the film. Big Brother Watch UK workers also hand out flyers to people such as the detainees (whose faces are blurred in the documentary), with information on where to get help if they believe their rights were violated.

Kantayya comments: “I had to go to the U.K. for that [footage], because here in the U.S. … there’s no transparency [for this kind of police surveillance]. In the U.K., they have the General Data Protection Regulation [GDPR], so I could get that footage.”

The GDPR sets privacy guidelines for data collection and processing of information from individuals who live in the European Union. Members of the media are allowed greater access to investigate how this data is used in GDPR countries, compared to media in other countries. Although the U.K. officially exited the European Union in 2020, the GDPR is still the law in the U.K.

In June, three of the world’s largest big-data companies — Amazon, IBM, and Microsoft — announced that they would stop selling facial recognition technology to law enforcement. But Coded Bias makes it clear that there’s still more work to be done on technology-related civil rights issues.

Although Coded Bias was completed before the COVID-19 pandemic, Kantayya says that issues over A.I., algorithms, facial recognition technology, and gathering of personal information will play a role in COVID-19 contact tracing: “What I learned that’s so terrifying is how we are already outsourcing such powerful decisions to these automated gatekeepers that are deciding human destinies. These systems are not vetted for bias.”

She continues, “When Google and Apple swoop in as the white knight to say, ‘We’ve got this contact tracing thing. We just need to know everywhere you’ve been and everyone you’ve associated with,’ I think we should be really worried … I’m not sure a high-tech app is actually the solution.”

Kantayya says, “In making Coded Bias and re-centering the voices of women, I hope that we can unleash a new kind of imagination in what these technologies can be, instead of just invasive surveillance tools.” But she adds that it’s about more than featuring certain activists in a documentary: “We need systematic change.”


