Tenant Screening Software Faces Legal Reckoning Over AI Bias and Harm to Renters

Tenant screening software is facing a reckoning.

A newly filed lawsuit accuses a tenant screening company of failing to implement basic AI safety controls, resulting in widespread harm to renters.

According to the complaint, the company’s algorithm flagged people for criminal records they didn’t have and misclassified applicants in ways that disproportionately led to wrongful denials of housing for low-income and minority renters. Errors like these exacerbate existing inequalities in the rental market.

Many of these issues could have been prevented had the company implemented algorithmic fairness testing and de-biasing techniques.
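One widely used fairness test is the adverse impact ratio (AIR): the approval rate for a protected group divided by the approval rate for a control group, with values below 0.8 commonly treated as a red flag under the "four-fifths" rule of thumb. Here is a minimal sketch in Python of what such a check might look like, using synthetic data and hypothetical column names; it is illustrative only, not FairPlay's actual methodology or the screening company's system.

```python
import pandas as pd

# Hypothetical screening outcomes: 1 = approved, 0 = denied.
# The column names ("group", "approved") and the data are illustrative.
data = pd.DataFrame({
    "group":    ["control"] * 5 + ["protected"] * 5,
    "approved": [1, 1, 1, 1, 0, 1, 0, 0, 1, 0],
})

# Approval rate for each group.
rates = data.groupby("group")["approved"].mean()

# Adverse impact ratio: protected-group approval rate divided by
# control-group approval rate.
air = rates["protected"] / rates["control"]

print(f"Approval rates:\n{rates}")
print(f"Adverse impact ratio: {air:.2f}")

# The "four-fifths" rule of thumb flags an AIR below 0.8 as evidence
# of potential disparate impact.
if air < 0.8:
    print("Potential disparate impact detected; review and de-bias the model.")
```

Real-world fairness testing goes well beyond a single ratio, but even a simple check like this can surface disparities before they reach renters.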

As AI increasingly powers critical decisions in our daily lives, failures like this will spark more legal challenges.

Make sure your AI is making headlines for innovation, not litigation — FairPlay can help.

Hat tip to Andrew W. Grant for bringing this case to my attention.

Contact us today to see how increasing your fairness can increase your bottom line.