Tech startup FairPlay aims to root out lending bias with AI

As regulatory scrutiny of banks’ lending practices intensifies, one startup is promoting artificial intelligence software aimed at helping banks remove biases built into their automated lending systems.

Kareem Saleh, CEO of FairPlay, says his work over the years with banks on fair lending left him disillusioned — and motivated him to find a way to undo bad practices.

“Most of the people employed in fair lending compliance don't actually make the lending practices of their institutions any fairer,” said Saleh, a former executive vice president at credit underwriting software company Zest AI. “They just try to come up with clever justifications for unfairness.”

On Monday, New York-based FairPlay — which Saleh co-founded with another Zest alumnus, John Merrill — announced it has raised $4.5 million in seed funding from Third Prime Capital, with participation from FinVC, TTV, Financial Venture Studio, Amara and Nevcaut Ventures.

The investment comes near the end of a year in which regulators in the new Biden administration have spotlighted loan discrimination issues and banks have sought to respond to persistent calls nationwide for greater inclusion and racial equity.

Last week, the Federal Reserve Bank of Philadelphia issued a report that said men on average receive credit card limits that are more than $1,323 higher than those of women with comparable income and credit scores. In October, the Justice Department announced the launch of its Combatting Redlining Initiative, saying it “represents the department’s most aggressive and coordinated enforcement effort to address redlining.” That same day, several government agencies announced a $9 million settlement with the $17 billion-asset Trustmark National Bank in Jackson, Mississippi, over allegations it redlined predominantly Black and Hispanic neighborhoods in Memphis, Tennessee. The Justice Department announced a similar settlement with the $18.7 billion-asset Cadence Bancorp in August.

FairPlay is starting off with two fair-lending software tools, packaged as application programming interfaces that banks can use with their existing lending programs. The first, Fairness Analysis, analyzes a bank’s existing loan software for signs of discrimination. It tries to answer five questions: Is my algorithm fair? If not, why not? How could it be fairer? What is the economic impact on the business of being fair? And do applicants who are rejected get a second look to see if they might resemble favored borrowers?

“That allows us to automate that fair lending analysis in ways that make bias detection fast, easy to understand and affordable,” Saleh said. “When we find disparities, we can actually do something to remediate them.”
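FairPlay has not published the internals of Fairness Analysis, but the kind of check such tools automate can be sketched. The example below, with invented names and numbers, computes adverse impact ratios: each group’s approval rate divided by the most-favored group’s rate, where values below 0.8 are commonly treated as a red flag under the four-fifths rule.

```python
# Illustrative sketch only; FairPlay has not published its methods.
# Computes adverse impact ratios from a hypothetical decision log.
import pandas as pd

def adverse_impact_ratios(decisions: pd.DataFrame,
                          group_col: str = "group",
                          approved_col: str = "approved") -> pd.Series:
    """Each group's approval rate relative to the highest-approving group."""
    rates = decisions.groupby(group_col)[approved_col].mean()
    return rates / rates.max()

# Hypothetical decision log exported from an existing lending system
log = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0,   1],
})
print(adverse_impact_ratios(log))
# A: 1.00, B: 0.53 -- group B falls below 0.8 and warrants a closer look
```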

The second API, Second Look, reevaluates declined loan applications using more complete information about borrowers in protected classes and different modeling techniques to see if they resemble creditworthy borrowers.

“When we do this, we find that the highest-scoring Black, brown and female borrowers that get declined perform as well as the lowest-scoring white males that get approved,” Saleh said.
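FairPlay’s exact approach is not public, but the idea lends itself to a simple sketch: re-score declines with an alternative model and compare them against the weakest applicants the primary model approved. Everything below, including the scores, is hypothetical.

```python
# Hypothetical second-look pass, not FairPlay's actual API: flag declined
# applicants whose score under an alternative model matches or beats the
# bottom decile of applicants the primary model approved.
import numpy as np

def second_look(declined_scores: np.ndarray,
                approved_scores: np.ndarray,
                percentile: float = 10.0) -> np.ndarray:
    """Boolean mask of declines that outscore the weakest approvals."""
    threshold = np.percentile(approved_scores, percentile)
    return declined_scores >= threshold

approved = np.array([0.62, 0.70, 0.55, 0.81, 0.67])  # alternative-model scores
declined = np.array([0.40, 0.58, 0.73, 0.51])
print(second_look(declined, approved))  # [False  True  True False]
```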

The software pulls in census data and the proxy methodology that the Consumer Financial Protection Bureau uses to assess whether lenders are in compliance with fair-lending laws. Those techniques help recognize borrowers in protected classes, even where a bank is not allowed to collect information about race, age and other factors.
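The CFPB’s published proxy method is Bayesian Improved Surname Geocoding, or BISG, which combines surname-based race probabilities with census-tract demographics. A common simplified form multiplies the two and renormalizes; the probabilities below are invented for illustration.

```python
# Simplified BISG-style proxy with made-up numbers. Real BISG draws on the
# CFPB's published surname and census-geography tables; this sketch assumes
# surname and geography are conditionally independent given race.
def bisg_posterior(p_race_given_surname: dict, p_race_given_tract: dict) -> dict:
    joint = {race: p_race_given_surname[race] * p_race_given_tract[race]
             for race in p_race_given_surname}
    total = sum(joint.values())
    return {race: p / total for race, p in joint.items()}

surname = {"white": 0.05, "black": 0.90, "hispanic": 0.05}  # surname-based prior
tract   = {"white": 0.60, "black": 0.30, "hispanic": 0.10}  # tract demographics
print(bisg_posterior(surname, tract))
# {'white': ~0.10, 'black': ~0.89, 'hispanic': ~0.02}
```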

The software has been trained using historical data from other lenders to detect behaviors of underrepresented classes that might look different from the borrowers a bank has lent to in the past. For example, the algorithm has been taught that women may have inconsistent employment between the ages of 25 and 45. Though this would be a creditworthiness red flag for male borrowers, it isn’t necessarily for women, who are more likely to take breaks to raise children.
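FairPlay has not described its model architecture, but the employment-gap example maps naturally onto an interaction term: instead of one global penalty for a gap, a model can learn a segment-specific adjustment. The simulation below is a toy illustration with invented data.

```python
# Toy illustration, not FairPlay's model: an interaction term lets the model
# learn that an employment gap predicts default in one segment but not another.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gap = rng.random(n) < 0.3   # employment gap between ages 25 and 45
seg = rng.random(n) < 0.5   # proxy-estimated segment (e.g., likely female)
# Simulated truth: gaps raise default risk only outside the segment
default = ((gap & ~seg & (rng.random(n) < 0.5)) | (rng.random(n) < 0.08)).astype(int)

X = np.column_stack([gap, seg, gap & seg]).astype(float)
print(LogisticRegression().fit(X, default).coef_[0])
# Large positive weight on the gap feature, roughly cancelled by a negative
# interaction weight: the gap penalty effectively applies to only one segment.
```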

“The result is that lenders can increase their approval rates of Black, brown and female applicants, sometimes on the order of 10% to 30%,” Saleh said. This can lead to better compliance with fair-lending rules and increased overall profitability, he said.

The software could help banks work toward overcoming disparate impact, said Venkat Varadachary, former chief data officer at American Express and current CEO of the AI automation firm Zenon.

“It can help a lot of midsized and smaller banks do things that they're just technically not able to do,” he said, because they have older software and lack the manpower to analyze every loan for fairness.

Fair-lending compliance software has been around for some time. For more than two decades, Wolters Kluwer has offered Fair Lending Wiz, which analyzes loan and application data to help identify potential disparities that could indicate discrimination. Matt Marek, senior director of technology product management for Wolters Kluwer Compliance Solutions, said he has seen an uptick in interest this year.

“We definitely have a lot of inquiries,” he said. “We're having a lot of conversations.”

The key difference between FairPlay and its competitors, according to Saleh, is its ability to re-underwrite declined loans using information about people in the protected classes of race, gender, national origin, marital status, military affiliation and disability who are underrepresented in credit bureau reports and scores.

Given the way FairPlay routes declined applications to a second-look model, banks don’t have to rip and replace existing loan decision software, he said. And FairPlay can be used by nontechnical people in areas like legal and compliance, he said.
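That integration pattern is straightforward to picture: the existing decision engine runs unchanged, and only declines are forwarded. The object and method names below are hypothetical stand-ins, not documented interfaces.

```python
# Sketch of the routing pattern described; `primary_model` and
# `second_look_api` are hypothetical, not FairPlay's actual interfaces.
def decide(application, primary_model, second_look_api):
    decision = primary_model.decide(application)  # existing software, untouched
    if decision == "declined" and second_look_api.rescore(application) == "approve":
        return "approved on second look"
    return decision
```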

As more lending decisions are made by algorithms, consumers — especially millennials and Generation Z — are going to demand that those decisions be made fairly, Saleh said. Regulators will demand the same, he said.

Figure, an online lender based in San Francisco that offers home equity lines of credit, mortgage refinancing, unsecured personal loans and a buy now/pay later credit product, uses FairPlay to monitor and test its unsecured personal loans for fair-lending compliance.

Figure’s chief compliance officer, Rory Birmingham, said that when he was a bank examiner, he saw “how expensive and sometimes not useful the big consulting agencies can be for fair lending. It just turns into a mess of an engagement and takes forever.” He liked the idea of being able to give his internal team software to monitor and test loans.

Machine-learning models can be hard to test, Birmingham said, because they are constantly learning and taking new factors into consideration.

“It's really hard to do without the right technology,” Birmingham said. “FairPlay was built for this new wave of underwriting and pricing models. It's using the same kind of AI technology that our [loan decision] model is using.”

After decades of awareness of loan discrimination, one might expect lenders to have this problem under control by now.

Varadachary pointed out that even lending-decision models that have been carefully vetted for bias can have disparate impact for multiple reasons, including the limited data available for people with a thin credit report.

“If you only underwrite certain populations, when you see a new population, you're not very good at underwriting the new population,” Varadachary said. “Historically, people from disadvantaged backgrounds have been denied credit and there's not enough data or information on those people to grant them credit. The entire system is rigged against them.”

“Where is the bias in the system? It's everywhere,” Varadachary said.
