AI technologies are increasingly used for surveillance, tracking people's movements, and gathering biometric data. Although transit agencies, police, and self-driving car companies build these technologies in the name of safety and convenience, they put people at greater risk when the technology does not work well for them. Using AI in transportation can also threaten our freedom of movement on roads and in the air.
“They’re taking information that I didn’t realize was going to be shared and screwing with our insurance.”
— ProPublica
If you believe you’ve been harmed by artificial intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and within 72 hours on weekends.
If you are seeking legal advice or representation, consider reaching out to the ACLU office in your state.
The #FreedomFlyers campaign raises awareness about the TSA's expanding use of facial recognition at airports. Make your voice heard by filling out a TSA Scorecard.
In Design Justice, Dr. Sasha Costanza-Chock, Senior Research Advisor to AJL, discusses how to design technology that works for everyone, including how airport security scanners discriminate against transgender people because they are built around outdated, binary ideas about gender.
In Predictive Inequity in Object Detection, Georgia Tech researchers showed that self-driving cars’ pedestrian-detection systems perform less accurately for pedestrians with darker skin tones.
In Bias Behind the Wheel, researchers based in China, the UK, and Singapore analyzed how self-driving car systems can detect pedestrians less reliably depending on attributes such as age and gender.
In 2024, Alabama passed a law to regulate the use of self-driving cars, requiring companies and drivers to register vehicles with automated driving systems.
The EFF has compiled a list of resources that helps individuals figure out what data their cars are tracking and, where possible, how to opt out of sharing it.
This project tracks the privacy practices of car manufacturers, rating companies on their data habits so consumers can make informed choices.
The Electronic Frontier Foundation (EFF) studied eight days of Automated License Plate Reader (ALPR) data to show how ALPRs work and push for more police accountability.
Stay up to date with the movement towards equitable and accountable AI.