The use of artificial intelligence in housing, sometimes called "Landlord Tech," can take the form of invasive surveillance of tenants, discriminatory rental denials, or automated rent increases. These systems may encode biases and give landlords an unfair advantage, making it harder for everyone to access suitable housing.
The risks of this technology affect both renters and those looking to own homes. Constant surveillance threatens residents' privacy and autonomy. Additionally, errors in biometric systems and automated property valuation models can worsen existing inequalities in the housing market.
These tools 'are not foolproof,' and their mistakes can adversely impact public housing residents… this is the type of technology that the department is cautioning against
The Washington Post
If you believe you’ve been harmed by artificial intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and 72 hours over the weekend.
If you are seeking legal advice or representation, consider reaching out to an ACLU office in your respective state.
The Coded Bias documentary, released in 2020, tells the story of Dr. Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately. The film highlights the story of a building management company in Brooklyn that planned to implement facial recognition technology to control tenants' entry to their homes.
In 2019, algorithmic bias researchers, including Dr. Joy Buolamwini, Dr. Timnit Gebru, and Inioluwa Deborah Raji, submitted an amicus letter in support of the Brooklyn tenants who were pushing back against the use of facial recognition technology in their building.
In 2023, the National Low Income Housing Coalition (NLIHC) submitted a report on unjust automated screening processes to the Consumer Financial Protection Bureau and the Federal Trade Commission. The report outlines the various ways these systems discriminate, particularly against low-income renters.
The Countering Tenant Screening Initiative collects tenant screening reports to hold tenant screening algorithms accountable and to teach individuals about how tenant screening works.
The National Fair Housing Alliance published the Method for Improving Mortgage Fairness report, which describes how to improve fairness in mortgage underwriting models through methods like distribution matching.