
A Lawsuit: AI Landlords Discriminate Against Low-Income Tenants

[Illustration: a robot landlord standing in front of a building with a "For Rent" sign.]

In recent years, the use of artificial intelligence (AI) across many sectors has raised significant concerns about fairness and discrimination. A notable case involves SafeRent Solutions, a tenant screening service that faced allegations of discriminating against low-income tenants, particularly those using housing vouchers. Let's take a closer look at the story behind this case and its broader implications for AI and housing discrimination law.

Background of the Case

SafeRent Solutions, formerly known as CoreLogic Rental Property Solutions, utilized an AI-powered screening tool to evaluate potential tenants. The tool, known as the "SafeRent Score," was designed to assist landlords in making informed decisions about tenant applications. However, a class action lawsuit filed in Massachusetts alleged that the algorithm disproportionately harmed Black and Hispanic applicants who relied on federally funded housing choice vouchers.

The lawsuit claimed that the algorithm's reliance on credit history and debts unrelated to rental payments produced biased outcomes, in violation of the federal Fair Housing Act, which prohibits housing discrimination based on race, as well as Massachusetts law barring discrimination based on source of income, including housing vouchers.

Settlement Details
