Daily Local News (West Chester, PA)
Cheyney screens ‘Coded Bias,’ hosts discussion with director
Cheyney University held a screening of the Sundance film “Coded Bias” and followed it with a discussion with the film’s director, Shalini Kantayya, who described technology as the civil rights battleground of the 21st century.
“Coded Bias,” a nominee for the Grand Jury Prize at the 2020 Sundance Film Festival, follows the yearlong journey of Massachusetts Institute of Technology graduate student Joy Buolamwini as she uncovers bias in facial recognition, algorithms and artificial intelligence, from software at MIT to an apartment complex in Brooklyn, N.Y., to a teacher’s evaluation in Houston, Texas.
Buolamwini’s quest takes her from interactions with Big Tech, where IBM invited her in and corrected its technology, others dismissed her and Amazon tried to discredit her work, to the halls of the U.S. Congress, where she testified at a 2020 hearing that resulted in lawmakers introducing legislation to ban facial recognition. There is not yet a federal law banning the technology, although some states, such as Illinois, Texas and Washington, have passed laws limiting its use.
“We could essentially roll back 50 years of civil rights advances in the name of trusting these black box algorithms that we can’t question, we don’t know how they work, trusting them to be neutral when they’re not neutral,” Kantayya said. “This is the undone work of the civil rights movement. The fight ahead is going to be on this level of big tech.”
Kantayya said AI is on the precipice of becoming a $150 trillion industry.
“It’s transforming everything that we love in a democracy - fair housing, fair elections, fair employment,” she said. “I didn’t really realize the ways in which algorithms, machine-learning AI, is increasingly becoming a gatekeeper of opportunity and increasingly making decisions about who gets hired, who gets into college, what quality of healthcare someone receives, what kind of credit you get on a credit card, what communities get undue police scrutiny, how long a prison sentence someone serves.
“I don’t think I really realized the way human beings are outsourcing our autonomy and our decision-making to machines in ways that really govern human destiny,” Kantayya said.
She added that these systems have little oversight.
“These same systems that we’re trusting so implicitly have not been vetted for racial bias or for gender bias or that they won’t discriminate, that they won’t do unintended harm, that they won’t hurt people,” Kantayya said. “In some cases, these systems that we’re trusting haven’t even been vetted for some shared standard of accuracy aside from the company that seeks to benefit economically from its deployment.”
Bias, she explained, is not confined to a few bad people.
“It’s an inherent, innate condition that we all have and that we’re unconscious of, regardless of our upbringing or where we come from or our socio-economic backgrounds,” Kantayya said. “So that means we have to constantly be vigilant about ... our own biases. Joy talks about how it’s like hygiene, you can’t just do it on one day and say, ‘We did that.’”
And that’s why technology, its creators and its employees need to be inclusive and diverse, Kantayya said.
“We all have a place at the table because these technologies impact all of us,” the director said, adding that education can help many people recognize when algorithms are being misused. “I felt like that ... ‘Who am I to question this power? AI is for the smart, rich people in Silicon Valley.’ And, so, it becomes this kind of power that cannot be questioned.”
Yet her film shows examples of people who felt something was amiss and sought out expertise to correct the bias they were experiencing.
On the “Coded Bias” website, there is an activist toolkit that provides definitions and resources. Buolamwini also helped found the Algorithmic Justice League for this purpose; it can be visited at ajl.org.