Tenant Screening Algorithms Disproportionately Reject Black and Latino Renters

housing
Automated tenant screening systems combine credit scores, eviction records, and criminal background data into a single accept/reject recommendation for landlords. These algorithms routinely return incorrect, outdated, or misleading information, and because Black and Latino Americans are disproportionately arrested (due to documented policing disparities) and disproportionately named in eviction filings (due to income inequality and housing instability), the errors compound along racial lines. An eviction filing that was dismissed in the tenant's favor still shows up as a mark against them. An arrest that never led to a conviction is treated as equivalent to a guilty verdict. The landlord sees a red score and rejects the application without ever examining the underlying data.

The applicant loses the apartment and likely the application fee (typically $30-75, non-refundable). They apply to the next listing and are rejected again by the same algorithm pulling from the same flawed database. In tight rental markets, this creates a cycle where people with common names, prior contact with the justice system, or previous landlord disputes are effectively locked out of housing.

This persists because tenant screening companies are largely unregulated compared to credit bureaus. HUD issued guidance in May 2024 on the Fair Housing Act's application to tenant screening, but guidance is not enforcement. Landlords have no incentive to look beyond the algorithm: doing so costs time and creates legal exposure if they override a 'reject' recommendation and the tenant later causes problems. In November 2024, two lawsuits were filed against private-equity-backed landlords for relying on screening systems with inaccurate eviction and criminal data.
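The failure mode described above (a dismissed filing or a non-conviction arrest counting the same as an adverse judgment) can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual logic; the function name, threshold, and record fields are all invented for the example:

```python
from dataclasses import dataclass


@dataclass
class Record:
    kind: str          # e.g. "eviction_filing" or "arrest"
    disposition: str   # e.g. "dismissed", "judgment", "conviction", "no_charge"


def naive_screen(credit_score: int, records: list[Record]) -> str:
    """Hypothetical screening logic: every record counts as a strike,
    regardless of disposition. A filing dismissed in the tenant's favor
    or an arrest that never led to charges is weighted identically to
    an adverse judgment or conviction."""
    strikes = len(records)  # disposition is never consulted
    if credit_score < 620 or strikes > 0:
        return "reject"
    return "accept"


# A tenant with good credit who won their eviction case is still rejected:
print(naive_screen(700, [Record("eviction_filing", "dismissed")]))  # -> reject
```

The point of the sketch is the commented line: the disposition field exists in the data but is never read, so the only way to change the outcome is to get the record removed from the database entirely, which is exactly the burden the lawsuits and HUD guidance address.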

Evidence

Georgetown Law's Poverty Journal documents how AI-powered tenant screening programs have high error rates and use overbroad criminal background categories (https://www.law.georgetown.edu/poverty-journal/blog/the-discriminatory-impacts-of-ai-powered-tenant-screening-programs/).

CDT found that screening algorithms enable racial and disability discrimination at scale (https://cdt.org/insights/tenant-screening-algorithms-enable-racial-and-disability-discrimination-at-scale-and-contribute-to-broader-patterns-of-injustice/).

HUD released Fair Housing Act guidance on tenant screening in May 2024 (https://www.nclc.org/hud-takes-aim-at-discriminatory-practices-by-tenant-screening-companies-and-housing-providers/).

NBC News reported on November 2024 lawsuits against PE-backed landlords for discriminatory screening (https://www.nbcnews.com/news/us-news/private-equity-landlords-screening-process-discriminated-renters-lawsu-rcna180707).
