Journalism of Courage

Are software algorithms racist and sexist?

A new film by Shalini Kantayya examines the biases inherent in artificial-intelligence programmes and why they threaten civil rights.

Got My Eyes On You.

AI, Ain’t I a Woman?…Today we pose this question to new powers…as faces increment scars/ Old burns, new urns, collecting data, chronicling our past/ Often forgetting to deal with gender, race and class.

When Joy Buolamwini, a Ghanaian-American poet and research assistant at the MIT Media Lab in Massachusetts, US, performed this spoken-word piece in 2018, inspired by African-American women's rights activist Sojourner Truth's 1851 speech "Ain't I a Woman?", she was voicing her research findings for a wider audience. The piece (available on YouTube) spoke of what she had discovered: racial bias in the facial-recognition software created by Amazon, Google, IBM and Microsoft, among others. These programs either failed to recognise people of colour accurately or didn't register them at all, or they misidentified women as men. Even former US first lady Michelle Obama, actor Oprah Winfrey and tennis player Serena Williams were not spared. Buolamwini was baffled to find that the software recognised her face only when she wore a white theatre mask. With Timnit Gebru of Microsoft Research, she co-authored a paper on the flaws of these algorithms. The efforts of her organisation, the Algorithmic Justice League, have since led cities such as San Francisco, Oakland and Somerville to pass laws banning government use of controversial facial-recognition technology. Last month, the Massachusetts Senate approved a bill to halt law enforcement's use of facial recognition.

Buolamwini's journey fascinated New York-based Indian-origin filmmaker Shalini Kantayya enough to make the docu-feature Coded Bias (2020), which premiered at this year's Sundance Film Festival. Among other issues, it shows the American multinational tech company Amazon deploying a hiring algorithm it believed to be fair. Only, it wasn't. "If your résumé had a women's college or sport, you were denied by that AI. It's a perfect example of how a company with the best intentions could still do unintended harm, even roll back decades of civil rights that women have fought for, because an algorithm replicated historic inequalities," says Kantayya.

Shalini Kantayya in a still from Coded Bias

Artificial Intelligence (AI), she says, when the term was coined in the 1950s, was the domain of white men. "I stumbled down the rabbit hole and discovered this dark side to technologies we use daily, on which society's future depends," says Kantayya, 38, who found that it was mostly women and people of colour leading the research into AI and algorithmic bias. "Astute scientists and mathematicians — seven PhD holders in the film — women, Jewish, queer, of colour — were able to see the technology from a different perspective. Inclusion isn't just about having more women in the front in a picture," she says. AI, however, manifests in opaque ways beyond facial recognition, adds Kantayya, who learnt something terrifying from Cathy O'Neil's book Weapons of Math Destruction (2016). "The automated systems are deciding who gets healthcare, gets hired, gets into college, gets a longer prison sentence, and none of this is being vetted for bias or accuracy," she says, adding that science as an objective field of reference is becoming an outdated idea.

In the wake of the killing of George Floyd, Coded Bias has acquired greater significance. It shows a 14-year-old African-American boy, confused and teary-eyed, on being told by the UK watchdog Big Brother Watch that the police had wrongly detained him owing to faulty facial-recognition software. "Our data rights are not some privileged idea of privacy where if you've nothing to hide you've nothing to fear. Data rights are the unfinished work of the civil-rights movement and integral to every right we enjoy in free democracies. Until we have comprehensive legislation around these rights, these technologies pose a real danger," says Kantayya, adding that she has felt underestimated in the American film industry as a "brown-skinned" Indian filmmaker, and has since learnt to trust her judgement about invisible "racism moments".

Born to immigrant Indian parents in Connecticut, she studied human rights and cinema, and edited videos for singers Sting, Mariah Carey and Phil Collins before becoming an "activist filmmaker". Her Nandita Das-starrer short film A Drop of Life (2007) won Best Short Film at the 2008 Palm Beach International Film Festival in the US. Since many Indians work in global tech companies, Kantayya wants them to understand the unintended consequences of the code they are programming. "Democracies," she says, "are built on citizens' access to reliable information, and companies like Facebook haven't made a big enough commitment to fight misinformation. Our democracies struggle with predatory targeting. We must question everything we see and hear. Google isn't an encyclopedia or a source of journalism, it's an advertising platform." Kantayya speaks of how, triggered by COVID-19, tech companies are racing to create apps that track users, resulting in data and privacy breaches. "Computers need to be questioned. We need to know who is it benefitting and who is it hurting," she says.
