This article originally ran in American Banker
Octane Lending, an online lender based in New York, faces a challenge when it comes to loan decisions. The company helps people buy powersports vehicles, such as motorcycles and all-terrain vehicles. Such loans tend to get reported as auto loans or secured consumer loans, not specifically as motorcycle loans, so finding comparable records is difficult.
So the company has built its own AI-based credit score and underwriting model. It also uses the FICO Auto Score 9.
Recently, to confirm that its credit models don't inadvertently reflect bias or have a disparate impact on disadvantaged communities, the $1.5 billion-asset Octane began deploying fairness testing software. Chief Risk Officer Ray Duggins, formerly chief risk officer at GE Capital and at Standard Chartered Bank's consumer bank in Singapore, is attuned to the need for fair lending and anti-discrimination efforts, which are closely regulated in Europe, having come from a large British bank.
“I’ve never built a model where I intended to discriminate against anyone,” Duggins said. “But you always have to go back and test to make sure you’re not doing something inadvertently.”
Octane is not alone. Its fairness vendor, Los Angeles-based FairPlay, developer of what it calls “fairness-as-a-service” for AI-based loan software, says 10 financial services customers, including two large banks, are using its software.
FairPlay this week raised $10 million in a Series A round led by Nyca Partners, with participation from Cross River Digital Ventures, Third Prime, Fin Capital, TTV, Nevcaut Ventures, Financial Venture Studio and Jonathan Weiner, a venture partner at Oak HC/FT. This follows FairPlay’s $4.5 million seed round in November.