Now Streaming: ‘Coded Bias’ Exposes the Tech Made without Women and People of Color in Mind

Dana Kendall

Shalini Kantayya’s documentary Coded Bias premiered at the Sundance Film Festival in January as part of the U.S. Documentary Competition and begins streaming in virtual cinemas on November 18. Below, hear what Kantayya had to say at the film’s world premiere in Park City.


For an art project in a class at MIT, Joy Buolamwini was trying to build a mirror that would overlay other faces onto her own. But she ran into a problem: the facial recognition software she was using couldn’t detect her face at all, until she put on a white mask.

That discovery sparked an investigation into the facial analysis software she had been using, along with competing products from other companies. Buolamwini, now a researcher at the MIT Media Lab, found that this type of software performed extremely well on white male faces but significantly worse on women and people of color, with Black women at the very bottom of the performance ratings. The problem, she discovered, lay in the data sets used to train these programs, which were overwhelmingly made up of white male faces.
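Neither the film nor this piece includes code, but the kind of disaggregated audit Buolamwini’s research describes can be sketched in a few lines of Python. Everything below, including the sample data and the accuracy_by_group helper, is hypothetical and purely illustrative: an impressive overall accuracy number can hide large gaps that only appear when results are broken out by demographic group.

```python
from collections import defaultdict

def accuracy_by_group(predictions, labels, groups):
    """Disaggregate classifier accuracy by demographic group.

    predictions, labels: parallel lists of model outputs and ground truth.
    groups: a parallel list of demographic labels, one per sample.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, group in zip(predictions, labels, groups):
        total[group] += 1
        if pred == label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical audit: overall accuracy here is 4/6 (67%), which sounds
# serviceable, but the per-group breakdown tells a very different story.
preds  = ["male", "male", "female", "male", "male", "male"]
truth  = ["male", "male", "female", "female", "female", "male"]
groups = ["lighter-skinned man", "lighter-skinned man",
          "darker-skinned woman", "darker-skinned woman",
          "darker-skinned woman", "lighter-skinned man"]

print(accuracy_by_group(preds, truth, groups))
# {'lighter-skinned man': 1.0, 'darker-skinned woman': 0.333...}
```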

In the Sundance Film Festival selection Coded Bias, director Shalini Kantayya followed Buolamwini and other computer scientists, data analysts, mathematicians, and activists as they uncovered this and many other biases embedded in the technology that governs our lives.

Kantayya explained, “So much of what we know about artificial intelligence has been fashioned from imaginings [about the future]. When I came across this research, I realized that we hadn’t yet really adjusted our thinking about what artificial intelligence is in the now, and all of the ways that it’s impacted our lives as automated systems and invisible gatekeepers.”




The problem, of course, is bigger than facial recognition used for art projects or social media filters. Algorithms can affect whether you get approved for a loan or called back for a job interview, and you may never know that one was used against you. When companies train these systems on data from the past, for example by feeding only successful male candidates into an HR screening system, the systems beget more homogeneity and ingrain existing biases even more deeply.
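As a rough illustration of that feedback loop (the data and the fit_hire_rates helper are hypothetical, not anything shown in the film), a screening “model” fit to biased historical decisions simply learns to reproduce them:

```python
# Toy sketch of a bias feedback loop: a naive screen trained on a
# past, human-biased hiring process rubber-stamps that process.

historical_hires = [
    # (candidate_gender, was_hired) from past decisions
    ("male", True), ("male", True), ("male", True), ("male", False),
    ("female", False), ("female", False), ("female", False), ("female", True),
]

def fit_hire_rates(history):
    """Estimate P(hired | gender) from historical decisions."""
    rates = {}
    for gender in {g for g, _ in history}:
        outcomes = [hired for g, hired in history if g == gender]
        rates[gender] = sum(outcomes) / len(outcomes)
    return rates

model = fit_hire_rates(historical_hires)
print(model)  # {'male': 0.75, 'female': 0.25}

def screen(candidate_gender, threshold=0.5):
    """Advance a candidate only if their group's learned rate clears
    the threshold. The system now disadvantages women purely because
    the past did, and retraining on its own outputs would entrench
    the disparity further with every cycle."""
    return model[candidate_gender] >= threshold

print(screen("male"))    # True
print(screen("female"))  # False
```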

But even if the bias itself is fixed, there is still the larger question of how these systems can be used, or misused. Americans may think of China’s ubiquitous facial recognition, government tracking, and “social scoring” systems as a faraway communist dystopia, but Western governments have already begun deploying facial analysis in public places in an attempt to identify criminals, raising serious questions about civil rights.


Director Shalini Kantayya at a screening of ‘Coded Bias.’ © 2020 Sundance Institute | Photo by Jonathan Hickerson

“It can’t just be a technical conversation, because how these systems are used is just as important” as whether they work, said Buolamwini. “And we always have to be asking, ‘Should this exist, and should these use cases exist?’”

Buolamwini likes to think about these issues as “algorithmic hygiene. … You wouldn’t shower once in 2020 and be like, ‘Look, we took the bias shower. We’re good.’ You’re gonna stink in 2021. It’s a process, so [we need] continuous oversight.” She and many experts like her are working toward federal regulation and oversight of these technologies.

When asked whether it was intentional that the vast majority of those experts appearing in the film were women, Kantayya responded, “I couldn’t find any qualified men.”

