Lesson 370 of 1570
When AI Decides Who Gets Housing
Lesson map
What this lesson covers
Learning path
The main moves in order
1. When AI Decides Who Gets Housing
2. Tenant screening
3. SafeRent
4. Disparate impact
Section 1
When AI Decides Who Gets Housing
Landlords increasingly use AI tenant-screening tools that pull court records, eviction history, and credit data, and the resulting decisions can be opaque and unfair.
SafeRent and other screening companies have been sued for denying applicants based on biased algorithmic scores; a 2023 settlement required SafeRent to change how it scores tenants.
Three issues
- Eviction filings hurt scores even when the case was dismissed or dropped
- Old records follow people for years
- Incorrect data is hard to challenge or correct
The big idea: AI in housing decisions can lock people out unfairly, and tenant-rights laws are only beginning to catch up.
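To make "disparate impact" concrete, here is a minimal sketch of the four-fifths rule, a common first-pass check borrowed from employment law and often applied to screening outcomes. All approval counts below are hypothetical, invented for illustration; real disparate-impact analysis involves more than this one ratio.

```python
# Four-fifths rule sketch: compare approval rates between two applicant groups.
# All numbers are hypothetical, for illustration only.

def selection_rate(approved: int, applicants: int) -> float:
    """Fraction of a group's applicants who were approved."""
    return approved / applicants

def impact_ratio(protected_rate: float, reference_rate: float) -> float:
    """Protected group's approval rate relative to the reference group's."""
    return protected_rate / reference_rate

# Hypothetical screening outcomes for two groups of 100 applicants each.
group_a_rate = selection_rate(approved=90, applicants=100)  # 0.90
group_b_rate = selection_rate(approved=54, applicants=100)  # 0.54

ratio = impact_ratio(group_b_rate, group_a_rate)  # 0.54 / 0.90 = 0.60
# Under the four-fifths rule, a ratio below 0.8 flags possible disparate impact.
print(f"impact ratio = {ratio:.2f} -> {'flag' if ratio < 0.8 else 'ok'}")
```

In this hypothetical, group B's approval rate is only 60% of group A's, well under the 0.8 threshold, so the screening outcome would be flagged for closer review.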
