Is It Time To Start Using Race And Gender To Combat Bias In Lending?

August 10, 2022

A woman, let’s call her Lisa, applies for a loan. She’s 35 with a graduate degree, a high-earning trajectory and a 670 credit score. She also just returned to work after taking time off to start a family.

Her application goes to an algorithm, which assesses her risk profile to determine whether she should be approved. The algorithm sees her recent gap in employment and labels her a “risky” borrower. The result? Her application is rejected.

Examples like this happen every day in lending. Are these decisions fair?

When it comes to fairness in lending, a cardinal rule is, “Thou shalt not use variables like race, gender or age when deciding whether to approve someone for a loan.”

This rule dates back to the Equal Credit Opportunity Act (ECOA), passed in 1974 to stop lenders from deliberately denying loans to Black applicants and segregating neighborhoods, a practice called redlining. The problem got so bad that the government banned the consideration of race or gender in loan approvals and other high-stakes decisions.

The assumption behind ECOA was that if decision makers—be they humans or machines—are unaware of attributes like race or gender at decision time, then the actions they take will be based on “neutral” and “objective” factors that are fair.

There’s just one problem with this assumption: keeping algorithms blind to protected characteristics doesn’t stop them from discriminating. A model can absorb bias through proxies, seemingly neutral variables, like an employment gap, that correlate with gender or race.
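To make the proxy problem concrete, here is a minimal sketch in Python (assuming NumPy and scikit-learn). The data, feature names and numbers are entirely synthetic and illustrative, not drawn from any real lender: a model trained with no gender column still approves women at a lower rate, because the employment-gap feature encodes gender and the historical labels encode past bias.

```python
# Minimal sketch: a "gender-blind" model can still produce disparate outcomes
# when an allowed feature acts as a proxy. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: 1 = woman, 0 = man. Never shown to the model.
is_woman = rng.integers(0, 2, size=n)

# Employment gaps correlate with gender in this simulation (e.g., parental
# leave) but carry no direct information about repayment ability.
employment_gap = rng.binomial(1, np.where(is_woman == 1, 0.4, 0.1))
credit_score = rng.normal(680, 50, size=n)

# Ground-truth repayment depends only on credit score.
repaid = (credit_score + rng.normal(0, 40, size=n) > 650).astype(int)

# Historical labels are biased: past underwriters penalized the gap itself.
label = np.where(employment_gap == 1,
                 repaid & (rng.random(n) > 0.3),
                 repaid)

# Train on credit score and employment gap only; no gender column.
X = np.column_stack([credit_score, employment_gap])
model = LogisticRegression(max_iter=1000).fit(X, label)

approved = model.predict(X)
print("Approval rate, women:", approved[is_woman == 1].mean())
print("Approval rate, men:  ", approved[is_woman == 0].mean())
# Despite never seeing gender, the model approves women less often: the gap
# feature stands in for gender, and the labels carry forward past bias.
```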