Can Freddie Mac's embrace of AI pull the industry along?

Freddie Mac's foray into artificial intelligence may be a significant turning point in broadening the use of the technology.

Until recently, most of the companies publicly using AI in lending decisions were alternative lenders and a handful of auto lending companies. But experts say Freddie Mac's decision to pilot software from ZestFinance, an AI company focused on the credit markets, in its loan decision-making could encourage other firms to follow suit.

“Freddie Mac and its sibling Fannie Mae are viewed as the gatekeepers to what is considered in and out of bounds by mortgage lenders,” said Leslie Parrish, senior analyst at Aite Group. “This partnership can serve as a significant step down the path to AI in lending.”

It's also significant because it appears to have the blessing of the Federal Housing Finance Agency at a time when no federal regulator has publicly sanctioned the use of AI for this purpose.

“It's great news to hear the FHFA is moving toward support for alternative underwriting data in the mortgage market,” said Dave Girouard, CEO of the online lender Upstart. “There's definitely a growing consensus that AI and alternative data are improving access to affordable credit.”


Still, Parrish cautions that there is a long way to go before AI use becomes widespread.

"Mortgage and other consumer lenders still face challenges in not only incorporating AI techniques in their businesses, but also ensuring that they can fully explain the lending decisions that result," he said.

Updating credit models

Freddie Mac, Fannie Mae and the FHFA are required by a law passed last year to use more modern methods of evaluating mortgages and thereby improve consumers' access to mortgage credit. The FHFA recently issued a rule that spells out a process Fannie and Freddie should use to validate newer, more nuanced credit scores such as FICO 9 or VantageScore 3.0.

The rule also allows the government-sponsored enterprises to run pilot programs to evaluate alternative scoring models. Though it hasn’t publicly said so (and didn’t respond to a request for an interview), the FHFA appears to have accepted Freddie’s pilot of ZestFinance’s software. But that is just the first step.

“This pilot is not the main game,” said Chi Chi Wu, attorney at the National Consumer Law Center. “The main game is still going to be whatever scoring model gets picked out of this process.”

Wu supports the idea of running pilots on alternative scores and models, she said, with the hope of expanding access to credit.

The controversies around AI in lending

Many observers say AI-based underwriting models are a promising alternative to the heavy use of FICO scores that is common today.

“Consumer advocates have repeatedly said that the touchstone for lending should be ability to repay,” Wu said. “A credit score is not a gauge of ability to repay. It doesn't include income or other assets. What does a credit score show? It shows how a consumer has handled credit in the past. If they have black marks, it could be because they're an irresponsible borrower, but it also could be that something happened in their lives — they lost their job, they got sick. I think a lot of people who have poor scores are in that situation.”

Instead of leaning on FICO scores, an AI engine can simultaneously consider many pieces of information that might include credit score but also cash flow, bill repayment and other signals of financial behavior. This could help people who haven’t had the opportunity to build up a credit history, or who have had a financial setback because of a medical problem but are otherwise financially responsible.
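
The mechanics can be illustrated with a toy model. The Python sketch below trains a gradient-boosted classifier on synthetic data with hypothetical features (credit score, debt-to-income ratio, monthly cash-flow surplus, missed bill payments). It is not Freddie Mac's or ZestFinance's actual model; it only shows the idea of scoring applicants on several signals at once rather than a single score cutoff.

```python
# Minimal sketch of multi-signal underwriting (hypothetical features,
# synthetic data; not Freddie Mac's or ZestFinance's actual model).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical applicant signals: a credit score plus cash-flow-style features.
X = np.column_stack([
    rng.normal(680, 60, n),        # credit score
    rng.normal(0.30, 0.12, n),     # debt-to-income ratio
    rng.normal(1200, 500, n),      # average monthly cash-flow surplus ($)
    rng.integers(0, 4, n),         # missed bill payments, last 12 months
])

# Synthetic repayment outcome driven by several signals, not the score alone.
logit = (0.01 * (X[:, 0] - 680) - 4.0 * (X[:, 1] - 0.3)
         + 0.001 * (X[:, 2] - 1200) - 0.6 * X[:, 3])
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = repaid

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Approve when the predicted repayment probability clears a threshold.
approve = model.predict_proba(X_test)[:, 1] >= 0.5
print(f"approval rate: {approve.mean():.1%}")
```

In a model like this, an applicant with a thin or blemished credit file can still clear the approval threshold if the other signals point to an ability to repay.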

“We’ve been able to show that a bunch of minority borrowers are being unfairly kept out of the system,” said Kareem Saleh, executive vice president of ZestFinance. “If you take a more holistic view of a borrower, you can dramatically narrow the approval rate gap between whites and minorities.”

Minority borrowers are more than twice as likely to be denied loans as white applicants, Saleh said.

“A lot of minority borrowers who get tagged as risky by conventional underwriting methods are actually not as risky as they seem,” he said.

The case against AI in lending is mostly uncertainty: What if an AI system makes a bad decision? What if it makes decisions that violate the Fair Credit Reporting Act or the Equal Credit Opportunity Act? What if the software can’t generate a meaningful reason for a denial?

“One of the things that we've emphasized with AI and machine learning is that lenders need to be able to provide an explanation of why a credit decision was made,” Wu said. “The FCRA and the ECOA require these adverse action notices that explain why decisions are made. Usually they revolve around the score: there’s too much debt, too much credit use on a credit line, too many accounts, the accounts are too new. With machine learning and artificial intelligence, the question is how do you provide these explanations if it's the machine deciding what factors will be used and what weights will be used?”

Online lenders and AI lending software providers say they provide explainability, and that the explanations and reason codes their software produces are more accurate and useful than traditional lenders’.
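
One way such an explanation might be produced can also be sketched. The example below assumes a simple logistic-regression scorer with hypothetical features: for a denied applicant, it ranks the factors that pull the score furthest below a reference profile and reports the top ones as principal reasons. Vendors use more sophisticated attribution methods for complex models, so this is only an illustration of the idea, not any lender's actual reason-code logic.

```python
# Minimal sketch of reason-code generation for an adverse action notice,
# assuming a logistic-regression scorer and hypothetical features.
import numpy as np
from sklearn.linear_model import LogisticRegression

FEATURES = ["credit_score", "debt_to_income", "monthly_cash_surplus", "missed_payments"]

rng = np.random.default_rng(1)
X = np.column_stack([
    rng.normal(680, 60, 2000),
    rng.normal(0.30, 0.12, 2000),
    rng.normal(1200, 500, 2000),
    rng.integers(0, 4, 2000).astype(float),
])
logit = (0.01 * (X[:, 0] - 680) - 4.0 * (X[:, 1] - 0.3)
         + 0.001 * (X[:, 2] - 1200) - 0.6 * X[:, 3])
y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

def reason_codes(applicant, reference, top_k=2):
    """Rank features by how far they pull this applicant's score below a
    reference profile (e.g., the average approved borrower)."""
    contribution = model.coef_[0] * (applicant - reference)
    worst = np.argsort(contribution)[:top_k]   # most negative contributions first
    return [FEATURES[i] for i in worst]

reference = X[y == 1].mean(axis=0)             # profile of loans that repaid
denied_applicant = np.array([590.0, 0.55, 200.0, 3.0])
print("principal reasons:", reason_codes(denied_applicant, reference))
```

The regulatory question Wu raises is whether this kind of attribution remains meaningful when the model, rather than an analyst, chooses which factors matter and how heavily to weight them.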

Wu also worries that if AI models consider factors like college attainment (as a few online lenders’ models do), they could result in racial and economic disparities, because Hispanics and African-Americans attend and graduate college at a much lower rate than Asians and Caucasians. People who graduate college tend to come from privileged backgrounds.

“Even if you could justify it legally, is that what we want?” she said. “How do you expand access to credit for low- and moderate-income borrowers, given what we know about who goes to college and who ends up graduating with a degree?”

Upstart uses educational data in its underwriting, but it represents only a handful of the more than 1,000 variables its model considers, according to Girouard, the Upstart CEO. About 55% of recent Upstart borrowers have a four-year college degree, and almost none are currently students.

According to a report the Consumer Financial Protection Bureau published in August about tests it has conducted, Upstart's machine learning model approves 27% more applicants than traditional models and yields 16% lower average annual percentage rates for approved loans. This expanded access to credit occurs across all tested race, ethnicity and gender segments.

More specifically, the CFPB found that near-prime consumers with FICO scores of 620 to 660 are approved approximately twice as frequently, applicants under 25 years of age are 32% more likely to be approved, and consumers with incomes under $50,000 are 13% more likely to be approved.

Cash-flow data is among the more widely accepted types of alternative data for use in AI models. A study recently conducted by FinRegLab found that cash-flow data is useful in predicting creditworthiness.

Some observers have warned that using AI in lending and extending credit to lower-FICO-score consumers could lead to another crisis like the one in 2008. Wu dismisses this idea.

"What happened in 2008 and the years leading up to it had to do with the behavior of the industry, not the behavior of consumers," she said.
