When MIT researcher, poet, and computer scientist Joy Buolamwini uncovers racial and gender bias in AI systems sold by big tech companies, she embarks on a journey alongside pioneering women sounding the alarm about the dangers of unchecked artificial intelligence that impacts us all. Through Joy’s transformation from scientist to steadfast advocate, and through the stories of everyday people experiencing algorithmic harms, Coded Bias sheds light on the threats AI poses to civil rights and democracy.
View screening details here.
Coded Bias weaves together the personal stories of people whose lives have been directly impacted by unjust algorithms. You can make an impact by helping us spread the word about the film, hosting a screening, or entering your email below to join the movement toward equitable and accountable AI.
Facial recognition tools sold by large tech companies, including IBM, Microsoft, and Amazon, are racially and gender biased. They have even failed to correctly classify the faces of icons like Oprah Winfrey, Michelle Obama, and Serena Williams. Around the world, these tools are being deployed, raising concerns about mass surveillance.
An award-winning and beloved teacher encounters injustice with an automated assessment tool, exposing the risk of relying on artificial intelligence to judge human excellence.
A building management company in Brooklyn plans to implement facial recognition software for tenants to use to enter their homes, but the community fights back.
Despite working hard to contribute to society, a returning citizen finds her efforts in jeopardy due to law enforcement risk assessment tools. The criminal justice system is already riddled with racial injustice, and biased algorithms are compounding it.
Help us shed light on the impact of AI harms on civil rights and people’s lives around the world. You can share your story using the hashtag #CodedBias or send us a private message.
AI systems are increasingly infiltrating our lives, influencing who gets a job, which students get admitted to college, how cars navigate the roads, what medical treatment an individual receives, and even who we date. And while builders of AI systems aim to overcome human limitations, research studies and headlines continue to remind us that these systems come with risks of bias and abuse. AI reflects what we call the coded gaze: the priorities, preferences, and at times prejudices of those who have the power to shape the technology.
Coded Bias illuminates our mass misconceptions about AI, emphasizes the urgent need for legislative protection, and follows the Algorithmic Justice League’s journey to push for the first-ever legislation in the U.S. to place limits on facial recognition technology. Support our work by sharing our advocacy and policy initiatives, or by making a donation so we can keep going.
Technology should serve all of us. Not just the privileged few.
Shalini Kantayya directed the season finale of the National Geographic series Breakthrough with executive producer Ron Howard. Her debut feature film, Catching the Sun, premiered at the Los Angeles Film Festival and was named a New York Times Critics’ Pick. The film was released globally on Netflix on Earth Day 2016 with executive producer Leonardo DiCaprio and was nominated for the Environmental Media Association Award for Best Documentary. Kantayya is a TED Fellow.
Joy Buolamwini, Founder of the Algorithmic Justice League
@jovialjoy
Cathy O’Neil, Author of Weapons of Math Destruction
@mathbabedotorg
Meredith Broussard, Author of Artificial Unintelligence
@merbroussard
Safiya Umoja Noble, Author of Algorithms of Oppression
@safiyanoble