
Glossary
Overview

All the terms you need to know when you’re trying to understand fairness, compliance, and regulation in the financial sector.

Adverse Action Reason

An adverse action reason code is a specific code provided by lenders, creditors, or other entities to explain the primary reasons for taking an adverse action against a consumer. Adverse actions can include denying credit, employment, insurance, or rental applications, offering less favorable terms than requested, or reducing credit limits.

These reason codes are required under the Fair Credit Reporting Act (FCRA) and the Equal Credit Opportunity Act (ECOA) to ensure transparency and protect consumers from unfair treatment. When an adverse action is taken based on information from a credit report or any other source, the lender or creditor must provide the consumer with an adverse action notice. This notice should include the adverse action reason codes, which help the consumer understand why they were denied or received less favorable terms.

Reason codes can vary depending on the type of credit, but common examples include:

  • Insufficient credit history: The consumer has not established enough credit to meet the lender’s requirements.
  • High credit utilization: The consumer’s outstanding debt is too high compared to their credit limits.
  • Delinquent past or present credit obligations: The consumer has a history of late or missed payments.
  • Employment history: The consumer has an unstable employment history or insufficient income.

By providing adverse action reason codes, lenders and creditors offer consumers an opportunity to understand and address the factors negatively affecting their credit applications, ultimately helping them improve their financial standing and creditworthiness.

Adverse Impact Ratio

Also known as the “Disparate Impact Ratio,” the AIR is a measure of fairness used by regulators and courts to assess how often protected groups receive a positive outcome, e.g. approval for a loan, compared to a control group. Although there are no definitive fairness thresholds, many industry participants consider an AIR above 90% to be satisfactory; an AIR between 80% and 90% to be concerning and potentially likely to trigger regulatory scrutiny; and an AIR below 80% to be a practically significant disparity carrying elevated regulatory risk.
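In code, the AIR is simply a ratio of approval rates. Below is a minimal sketch with hypothetical approval counts; the band noted in the comment follows the 80%/90% thresholds described in this entry.

```python
# Adverse Impact Ratio (AIR): approval rate of the protected group
# divided by the approval rate of the control group.

def adverse_impact_ratio(protected_approved, protected_total,
                         control_approved, control_total):
    """Return the AIR: protected approval rate / control approval rate."""
    protected_rate = protected_approved / protected_total
    control_rate = control_approved / control_total
    return protected_rate / control_rate

# Hypothetical example: 70 of 100 protected applicants approved,
# 85 of 100 control applicants approved.
air = adverse_impact_ratio(70, 100, 85, 100)
print(f"AIR = {air:.2f}")  # 0.70 / 0.85 ≈ 0.82, in the "concerning" band
```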

BISG

BISG and BIFSG are methods used to estimate a person’s race and ethnicity based on their name and where they live. BISG stands for Bayesian Improved Surname Geocoding, which uses a person’s last name and their location to make an educated guess about their demographic background. BIFSG, or Bayesian Improved First and Surname Geocoding, takes it a step further by considering both the first and last names along with the location. These techniques are helpful for financial institutions and other organizations to get a better understanding of the people they serve, without directly asking them for sensitive information about their race or ethnicity.
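The Bayesian update at the heart of BISG can be sketched in a few lines. The probability tables below are invented for illustration, and the conditional-independence assumption is a simplification; real implementations use Census surname lists and geography (e.g. census tract or block group) tables.

```python
# A minimal sketch of the BISG update, assuming surname and geography
# are independent given race/ethnicity. All tables here are hypothetical.

def bisg(p_race_given_surname, p_race_given_geo, p_race):
    """Combine surname- and geography-based probabilities via Bayes' rule.

    posterior(race) ∝ P(race | surname) * P(race | geo) / P(race)
    """
    unnormalized = {
        race: p_race_given_surname[race] * p_race_given_geo[race] / p_race[race]
        for race in p_race
    }
    total = sum(unnormalized.values())
    return {race: p / total for race, p in unnormalized.items()}

# Hypothetical tables for two groups:
surname = {"A": 0.7, "B": 0.3}   # P(race | surname)
geo     = {"A": 0.4, "B": 0.6}   # P(race | location)
prior   = {"A": 0.5, "B": 0.5}   # P(race) overall

posterior = bisg(surname, geo, prior)
```

BIFSG extends the same update with one more factor for the first name.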

Community Reinvestment Act

The CRA was passed by Congress in 1977 to encourage regulated financial institutions to help meet the credit needs of the communities in which they operate, particularly low- and moderate-income (LMI) individuals and areas. The CRA requires that financial institutions make credit available to LMI individuals and areas, small businesses, and small farms.

Disparate Impact

A type of discrimination that occurs when a policy, practice, or procedure that appears to be neutral on its face has a disproportionate negative impact on a particular group of people based on their protected characteristic, such as race, gender, or national origin. 

For example, a policy that requires all job applicants to have a high school diploma may seem neutral on its face, but it could have a disparate impact on certain groups of people, such as those who were not able to complete high school due to economic or social barriers.

In the context of lending and credit, disparate impact could occur if a lender has a policy or practice that results in certain groups of people being disproportionately denied credit, charged higher interest rates, or given less favorable loan terms. Even if the lender did not intend to discriminate against these groups of people, the policy or practice could still be considered discriminatory if it has a disparate impact.

ECOA

“Equal Credit Opportunity Act” – a federal law that prohibits lenders and other creditors from discriminating against applicants based on certain protected characteristics, such as race, color, religion, national origin, sex, marital status, age, or receipt of income from public assistance programs.

The ECOA applies to all types of credit, including credit cards, mortgages, car loans, and other types of loans. It requires lenders to consider an applicant’s creditworthiness based on their financial history and ability to repay the loan, rather than on their personal characteristics or other non-financial factors.

Under the ECOA, lenders and other creditors are required to provide applicants with a notice of their rights under the law. The law also requires lenders to notify applicants in writing if their credit application is denied, and to provide specific reasons for the denial (Adverse Action Reasons).

The ECOA is enforced by several government agencies, including the Consumer Financial Protection Bureau (CFPB) and the Federal Trade Commission (FTC). 

Fair Credit Reporting Act

The Fair Credit Reporting Act (FCRA) is a U.S. law that ensures the accuracy, fairness, and privacy of the information collected by credit reporting agencies, which are companies that gather and maintain information about your credit history. This law aims to protect consumers by regulating how these agencies collect, store, and share your credit information.

In simple terms, the FCRA gives you certain rights when it comes to your credit report:

  • Access to your credit report: You have the right to request a free copy of your credit report from each of the three major credit reporting agencies (Equifax, Experian, and TransUnion) once every 12 months.
  • Accuracy of information: If you find any incorrect or outdated information in your credit report, you have the right to dispute it. The credit reporting agency must investigate and correct any errors within a specific time frame.
  • Privacy protection: Your credit information can only be accessed by parties with a legitimate reason, such as lenders, landlords, or employers, and only with your consent in some cases, like for employment purposes.
  • Notification of negative information: If a lender or creditor plans to take negative action against you, like denying your credit application, they must inform you of the reasons and provide you with the name and contact information of the credit reporting agency that supplied the information.
  • Limited time for negative information: Negative information on your credit report, such as late payments or bankruptcies, can only remain on your report for a certain period, usually seven to ten years, after which it must be removed.

Fair Lending

Fair lending refers to the practice of providing equal access to credit and financial services to all individuals, regardless of their race, ethnicity, gender, national origin, religion, age, or other protected characteristic. Fair lending laws and regulations are designed to ensure that all individuals and communities have equal access to credit, and that lenders and other financial institutions do not engage in discriminatory practices.

Fair lending laws and regulations prohibit lenders and other creditors from engaging in discrimination or practices that have a disparate impact on protected groups. These laws require lenders to assess creditworthiness based on objective criteria, such as an individual’s credit history and ability to repay a loan, rather than on subjective factors such as race or national origin.

Some of the key federal fair lending laws include the Equal Credit Opportunity Act (ECOA), the Fair Housing Act (FHA), and the Home Mortgage Disclosure Act (HMDA). These laws prohibit discrimination in all types of credit transactions, including mortgages, auto loans, credit cards, and personal loans.

Enforcement of fair lending laws is carried out by various government agencies, including the Consumer Financial Protection Bureau (CFPB), the Department of Justice (DOJ), and the Federal Trade Commission (FTC). Many states have their own fair lending laws and regulations. These laws may provide additional protections for consumers and may be enforced by state agencies or through private lawsuits.

HMDA

“Home Mortgage Disclosure Act” – a federal law that requires financial institutions to collect and report data about their mortgage lending activity to help identify possible discriminatory lending patterns and ensure that all individuals and communities have access to fair and equal credit opportunities.

Under HMDA, financial institutions are required to collect and report information about the race, ethnicity, gender, and income of mortgage applicants, as well as information about the type of mortgage product they applied for, the loan amount, and the outcome of the application.

This information is then compiled and made available to the public, where it can be used to identify disparities in mortgage lending and to promote fair lending practices. Additionally, the data collected under HMDA is used by government agencies such as the Consumer Financial Protection Bureau (CFPB) to enforce fair lending laws and regulations.

Loss function

A mathematical function that is used to measure the difference or error between a predicted value and the actual or observed value. In machine learning, a loss function is typically used to train models and optimize their parameters by minimizing the difference between predicted and actual values.

The loss function quantifies the error between the predicted output of a model and the actual output. It takes as input the predicted output of the model and the actual output, and returns a scalar value that represents the error. The objective of the model is to minimize this error, which is done by adjusting the model’s parameters during the training process.
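As a concrete example, mean squared error is one of the most common loss functions: it averages the squared differences between predicted and actual values.

```python
# Mean squared error (MSE): the average squared difference between
# the model's predictions and the observed values.

def mse(predicted, actual):
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)

# Errors are 0.5, 0.5, and 0.0, so the MSE is (0.25 + 0.25 + 0) / 3.
print(mse([2.5, 0.0, 2.0], [3.0, -0.5, 2.0]))  # ≈ 0.167
```

During training, an optimizer adjusts the model's parameters in the direction that reduces this value.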

Redlining

Redlining is a discriminatory practice in which financial institutions, such as banks and insurance companies, systematically deny or limit access to loans, mortgages, or other financial services to people living in specific geographical areas, often based on the racial or ethnic composition of those neighborhoods. The term “redlining” originated from the practice of drawing red lines on maps around areas deemed as “high risk” or “undesirable” by financial institutions, effectively excluding residents of these areas from equal access to credit, housing, and other financial opportunities.

Regressions

When fair lending practitioners use the term “regressions,” they are typically referring to statistical techniques used to understand the relationship between different variables, such as borrower characteristics and loan approval or denial. For years, regression models helped fair lending practitioners identify which variables have a significant impact on outcomes for protected groups. But newer forms of explanatory math, like Shapley values, are gaining popularity as they offer a more comprehensive and nuanced understanding of the variables that drive disparities for protected groups.
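As an illustration, a simple ordinary least squares fit of one hypothetical borrower characteristic against a pricing outcome might look like the sketch below. Real fair lending regressions use many variables and, for approve/deny outcomes, typically logistic regression; the data here is invented.

```python
# Closed-form simple linear regression: y ≈ intercept + slope * x.

def ols_slope_intercept(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    slope = cov / var
    return mean_y - slope * mean_x, slope

# Hypothetical data: credit score (x) vs. offered interest rate (y).
scores = [620, 660, 700, 740, 780]
rates  = [9.0, 8.2, 7.5, 6.9, 6.1]
intercept, slope = ols_slope_intercept(scores, rates)
# A negative slope here would indicate higher scores receive lower rates.
```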

Shapley Values

Shapley Values, which originate from cooperative game theory, are used to identify the contributions of individual variables to the overall prediction made by a predictive model. They provide an interpretable way of explaining complex models, such as decision trees and neural networks, which are usually harder to understand than traditional regression models. Shapley values are considered the state of the art in computing drivers of disparity: they provide consistent explanations for predictions across different instances in the dataset, they account for interaction effects between variables, and they can be applied to a wide range of predictive models, not just linear regressions, making them versatile across different types of predictive tasks.
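For a small number of variables, Shapley values can be computed exactly by enumerating every feature subset, as in the toy sketch below. The scoring model and inputs are purely illustrative; in practice, libraries approximate this computation for large models.

```python
# A toy exact Shapley computation: enumerate all feature subsets and
# average each feature's marginal contribution. "Missing" features are
# set to a baseline value. The model here is hypothetical.

from itertools import combinations
from math import factorial

def model(x):
    # A made-up scoring model with an interaction term between x[0] and x[2].
    return 2 * x[0] + 3 * x[1] + x[0] * x[2]

def shapley_values(model, x, baseline):
    n = len(x)

    def value(subset):
        # Evaluate the model with features outside `subset` held at baseline.
        point = [x[i] if i in subset else baseline[i] for i in range(n)]
        return model(point)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in combinations(others, size):
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                phi[i] += weight * (value(set(subset) | {i}) - value(set(subset)))
    return phi

phi = shapley_values(model, x=[1.0, 2.0, 3.0], baseline=[0.0, 0.0, 0.0])
# Efficiency property: the contributions sum to model(x) - model(baseline).
```

Note how the interaction term's credit is split between the two features involved, something a coefficient in a linear regression cannot express.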

SPCP

“Special Purpose Credit Program” – a type of lending program that is designed to meet the specific needs of a particular group or sector of borrowers. These programs typically provide loans to borrowers who may not qualify for traditional financing, such as individuals with low credit scores, small businesses, or agricultural producers.

SPCPs often offer favorable terms and conditions, such as lower interest rates, longer repayment periods, or reduced collateral requirements, to make credit more accessible to a targeted segment of borrowers. The goal of these programs is to promote economic development, create jobs, and support underserved communities, e.g. women-owned businesses, minority-owned businesses, or veterans.