One of the biggest challenges we face as an algorithmic fairness company is the lack of reliable data on some of society’s most vulnerable groups—particularly people with disabilities.
The Fair Housing Act prohibits discrimination on the basis of disability, and both the FHA and the Equal Credit Opportunity Act prohibit discrimination based on characteristics such as race, sex, and national origin. Yet, while financial institutions are required to avoid such discrimination, they often lack the necessary data to test for biases impacting these groups. For race, sex, and national origin, we have methods for imputing demographic data, but for characteristics like disability and religious affiliation, accurate data is scarce.
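To illustrate what such imputation looks like in practice, here is a minimal sketch of a BISG-style proxy, which combines surname-based and geography-based group probabilities with Bayes' rule. Every number, surname, and tract identifier below is a hypothetical placeholder rather than real census data, and a production proxy would be considerably more involved.

```python
# Minimal sketch of a BISG-style demographic proxy (illustrative only).
# Idea: P(group | surname, geo) is proportional to
#       P(group | surname) * P(group | geo) / P(group),
# assuming surname and geography are conditionally independent given group.

# Hypothetical probability tables (NOT real census figures)
p_group_given_surname = {
    "GARCIA": {"hispanic": 0.90, "white": 0.06, "black": 0.01, "other": 0.03},
    "SMITH":  {"hispanic": 0.02, "white": 0.73, "black": 0.22, "other": 0.03},
}
p_group_given_geo = {
    "tract_A": {"hispanic": 0.40, "white": 0.35, "black": 0.15, "other": 0.10},
    "tract_B": {"hispanic": 0.05, "white": 0.70, "black": 0.20, "other": 0.05},
}
p_group_baseline = {"hispanic": 0.18, "white": 0.60, "black": 0.13, "other": 0.09}


def bisg_proxy(surname: str, tract: str) -> dict:
    """Combine surname- and geography-based probabilities via Bayes' rule."""
    scores = {}
    for group in p_group_baseline:
        scores[group] = (
            p_group_given_surname[surname][group]
            * p_group_given_geo[tract][group]
            / p_group_baseline[group]
        )
    total = sum(scores.values())          # normalize so probabilities sum to 1
    return {group: s / total for group, s in scores.items()}


print(bisg_proxy("GARCIA", "tract_A"))
```

No comparable probability tables exist for disability status, which is precisely the gap this kind of method cannot bridge on its own.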
A recent report by Ariana Aboulafia, Miranda Bogen, and Bonnielin Swenor at the Center for Democracy & Technology highlights the need for better data on disabled populations—not only to ensure fairness in AI systems but to inform the development of disability-inclusive policies, allocate public funding, and uphold civil rights.
Collecting data on people with disabilities is especially challenging. Definitions of disability vary widely, and social stigmas can deter individuals from disclosing their disability status. Additionally, disability data collection tools themselves can be inaccessible—for example, surveys that are incompatible with screen readers exclude people who are blind.
Lenders themselves rarely collect data on disability status, for reasons ranging from privacy concerns to requirements under fair housing and credit reporting laws. Even in lending contexts that allow voluntary disclosure of disability status, the risk of regulatory scrutiny and reputational harm makes collecting protected-status information a rare practice.
In the absence of direct data, collaboration with advocacy groups specializing in disability rights could be a valuable step forward. These organizations can help identify areas where bias may occur and suggest strategies for mitigating it. At the same time, regulators like the CFPB and HUD could take steps to promote practices that would better identify and address the risk of disability discrimination, such as issuing guidance on permissible information collection in this space, suggesting how to proxy for missing information about disability status, or fostering innovation in fairness testing on disability status through no-action letters, regulatory sandboxes, or other means.
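If a defensible proxy for disability status did exist, the downstream fairness test could look something like the sketch below: a proxy-weighted adverse impact ratio that compares approval rates for the proxied group against everyone else. The approval outcomes and proxy probabilities are invented for illustration; this is not a description of how any regulator or lender currently tests for disability discrimination.

```python
# Illustrative sketch of a proxy-weighted disparity check (adverse impact
# ratio). Assumes each applicant has a proxy probability of belonging to the
# group of interest and a binary approval outcome. All numbers are made up.

approvals = [1, 0, 1, 1, 0, 1, 0, 1]                   # 1 = approved, 0 = denied
p_group   = [0.9, 0.8, 0.1, 0.2, 0.7, 0.1, 0.6, 0.3]   # proxy P(member of group)


def proxy_weighted_air(approvals, p_group):
    """Approval rate of the proxied group divided by the rate for everyone else."""
    grp_weight = sum(p_group)
    ref_weight = sum(1 - p for p in p_group)
    grp_rate = sum(a * p for a, p in zip(approvals, p_group)) / grp_weight
    ref_rate = sum(a * (1 - p) for a, p in zip(approvals, p_group)) / ref_weight
    return grp_rate / ref_rate


print(f"Adverse impact ratio: {proxy_weighted_air(approvals, p_group):.2f}")
```

The hard part is not the arithmetic; it is obtaining a proxy for disability status that is accurate and responsible enough to make such a test meaningful.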
As AI and alternative data continue to shape critical sectors like credit and housing, we must ensure that those who are often overlooked and undercounted—like people with disabilities—are not left behind. We need to push for more inclusive data practices and more comprehensive fairness assessments to avoid perpetuating disability discrimination in a rapidly evolving digital landscape.
https://shorturl.at/g553o