Weren’t algorithms supposed to make digital mortgages colorblind?

When banks and fintechs offer mortgages online using automated underwriting, they charge creditworthy minorities more than white applicants — the same way human loan officers do.

That's the conclusion of a study conducted by professors at the University of California, Berkeley.

As more online lenders and banks automate their lending, using mathematical models rather than loan officers to make loan decisions, the question of whether those algorithms can be unbiased, or whether they might introduce new and unintended forms of discrimination, is a critical one.

This Berkeley study offers a new answer: So far, lending algorithms in digital mortgages are biased in exactly the same way humans are, possibly because developers build the logic human lenders use into their software.

The report, "Consumer-Lending Discrimination in the Era of FinTech," concluded that lending officers and software-based underwriting engines both charge Latino and African-American loan applicants interest rates that are 6 to 9 basis points higher than white applicants who have the same FICO score and loan-to-value ratio. The higher interest rate was the same, whether it was a loan officer, a bank’s online lending arm or a fintech mortgage lender like Quicken or SoFi.

[Chart: Interest rates charged to minority versus white borrowers on digital mortgages, according to the University of California, Berkeley study]

Bank of America, which began offering a digital mortgage in April, declined to comment. Quicken and SoFi did not respond to requests for comment.

Some online lenders, such as Upstart (which does not offer mortgages), have said their algorithms help reduce the cost of credit and extend offers at better pricing to more people than traditional lenders do. Upstart uses “alternative” data about education, occupation and even variables from the loan application itself in its underwriting models. (For instance, people who ask for round numbers like $20,000 are a higher risk than people who ask for odder numbers like $19,900.)
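For illustration only (Upstart's actual features and code are not public), a round-number signal like the one described could look like this:

```python
# Illustrative sketch of a "round number" application signal; the
# granularity choice and function name are invented for this example.
def is_round_amount(requested: int, granularity: int = 1_000) -> bool:
    """True when the requested loan amount is a round multiple,
    e.g. $20,000; an odder figure like $19,900 returns False."""
    return requested % granularity == 0

assert is_round_amount(20_000) is True
assert is_round_amount(19_900) is False
```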

“A lot of variables that tend to be correlated with speed or lack of prudence are highly correlated with default,” Upstart co-founder Paul Gu said in a recent interview. “And indications that someone desperately needs the money right away will be correlated with defaults.”

Such factors are less discriminatory than relying on FICO scores, which correlate with income and race, according to the online lender.

But in the mortgage area, it appears that bank and fintech lenders are baking traditional methods of underwriting into their digital channels.

Robert Bartlett, law professor and faculty director for the Berkeley Center for Law, Business and the Economy, said the findings were counterintuitive at first.

“Most models of discrimination focus on the human element and in-group and out-group bias,” he said, referring to bias toward people with whom we have things in common versus against those with whom we have little in common. “We found discrimination in face-to-face lending. Presumably that was because human lenders were seeing the identity of the applicant and engaging in in-group or out-group bias. When we take away that human element, presumably that should diminish the level of discrimination.”

Upon reflection, however, he realized the findings should not be surprising. Bartlett and the other researchers reason that mortgage brokers are given rate sheets by their wholesale lenders that set a rock-bottom, or par, price. Every point the broker charges above par puts yield spread premium in the broker’s pocket.

“The incentive of brokers is always to put you in a more expensive loan, because that’s where they make a lot of their margin,” Bartlett said.
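As a stylized illustration of that incentive, with an invented rate sheet and payout schedule:

```python
# Stylized rate sheet: note rate (%) -> premium paid to the broker
# (as a % of the loan amount). All numbers here are invented.
RATE_SHEET = {
    5.000: 0.0,  # par: the rock-bottom price, no premium
    5.250: 0.5,
    5.500: 1.0,
    5.750: 1.5,
}

def yield_spread_premium(note_rate: float, loan_amount: float) -> float:
    """Dollars the wholesale lender pays the broker for a rate above par."""
    return loan_amount * RATE_SHEET[note_rate] / 100

# Steering a $300,000 borrower from par up to 5.75% earns the broker $4,500.
print(yield_spread_premium(5.750, 300_000))  # 4500.0
```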

Loan underwriting algorithms are designed to try to replicate the human model in a way that’s faster and more reliable.

“If that’s the way human lenders make their profit, then you can imagine someone writing an algorithm that tries to put people in more expensive loans,” Bartlett said.

Bartlett and his team speculate that online lenders collect data on things like geography or level of education, which correlate with consumers’ tendency to shop around for a better rate. They can then offer higher-priced loans to the least price-sensitive people.

“We know that there are people who shop around and people who don’t,” Bartlett said. “If you’re a mortgage broker, you’re going to behave differently when those two people walk in the door.”
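A toy sketch of this hypothesized mechanism, with invented fields and weights and no race variable at all, shows how pricing on shopping-propensity proxies can still produce a racial rate gap:

```python
# Toy sketch: price on predicted willingness to shop, never on race.
# All fields, weights and the 10-basis-point markup are invented.
BASE_RATE = 5.00  # par rate, in percent

def shopping_propensity(applicant: dict) -> float:
    """Crude 0-to-1 score built from proxies like education and geography."""
    score = 0.5
    if applicant.get("college_degree"):
        score += 0.3  # proxies for rate-shopping behavior...
    if applicant.get("high_lender_density_zip"):
        score += 0.2  # ...that can also correlate with race and ethnicity
    return min(score, 1.0)

def quoted_rate(applicant: dict) -> float:
    """The least price-sensitive applicants get the largest markup."""
    markup = 0.10 * (1.0 - shopping_propensity(applicant))  # up to 10 bps
    return BASE_RATE + markup

# Race never enters the model, yet if the proxies correlate with race,
# the output reproduces a gap like the 6 to 9 basis points in the study.
print(quoted_rate({"college_degree": False, "high_lender_density_zip": False}))  # 5.05
```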

The results suggest that underwriting model developers are building into the algorithmic lending models the logic loan officers use to drive a loan rate higher.

“The dynamic we know happens with rate sheets and discriminatory pricing has been transported into the electronic space,” Bartlett said. “That’s how we see the data and that’s how we understand the data. It’s hard for us to know for sure because we don’t have access to the algorithms themselves.”

To be sure, other factors could be involved in a loan decision, including age, educational background, geography, jobs, income, prospective income growth, and wealth.

But in the study, Bartlett’s team looked only at mortgages that were bought by Fannie Mae and Freddie Mac. The two government-sponsored enterprises look at only two credit variables: loan-to-value ratio and FICO score.

“Any differences in rates between people who have the same LTV and FICO combination can only be explained by something that has nothing to do with credit risk as dictated by the GSEs,” Bartlett said.
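One way to formalize that identification argument, reusing the hypothetical data from the earlier sketch plus an assumed 0-or-1 "minority" flag:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Rebuild the invented buckets from the earlier sketch so this stands alone;
# `loans` and its `minority` flag remain hypothetical.
loans["fico_bucket"] = pd.cut(loans["fico"], bins=range(300, 871, 20))
loans["ltv_bucket"] = pd.cut(loans["ltv"], bins=[0, 60, 70, 80, 90, 97, 100])
loans["cell"] = loans["fico_bucket"].astype(str) + "|" + loans["ltv_bucket"].astype(str)

# One fixed effect per FICO x LTV cell: the `minority` coefficient is then
# estimated only from rate differences *within* identical credit-risk cells.
model = smf.ols("rate ~ minority + C(cell)", data=loans).fit()

# With rates in percent, a coefficient near 0.06-0.09 corresponds to
# the study's 6-to-9-basis-point gap.
print(model.params["minority"])
```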

The issue is critical because "it’s so easy to imagine how algorithmic lending could make the problems of discrimination worse," Bartlett said.

The study also hit a positive note: Algorithmic lending does not appear to discriminate when it comes to approving loans. Indeed, the study found face-to-face lenders reject Latino and African-American applicants 4% more often than online lenders do.

“Fintechs seem to do the opposite of discrimination, catering to those discriminated against by face-to-face [lenders],” the report says.

It also found that discrimination declined for all lenders from 2008 to 2015, a drop the researchers attribute to the rise of fintech startups and the speed with which consumers can shop around among algorithm-based lenders.

But Bartlett cautions that lenders need to be careful about the data they pull into their automated loan underwriting.

“Why not use the high school you attended and the major you’ve chosen as a way to come up with a big data set and figure out exactly how those correlate with default?” Bartlett said. “But the problem is they’re going to correlate with race and ethnicity even beyond what they do in terms of predicting for default rates. Just because a variable predicts defaults doesn’t mean we should use it, because it could have a disparate impact.”
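A hedged sketch of the kind of screen Bartlett implies, using an invented decision rule and the familiar four-fifths rule of thumb:

```python
import pandas as pd

def disparate_impact_ratio(df: pd.DataFrame, feature: str, protected: str) -> float:
    """Selection-rate ratio in the spirit of the 4/5ths rule: how often the
    protected group (flag == 1) clears a feature-driven cutoff, relative to
    everyone else. Column names and the cutoff are illustrative only."""
    passes = df[feature] > df[feature].median()  # toy decision rule
    return passes[df[protected] == 1].mean() / passes[df[protected] == 0].mean()

# A ratio below roughly 0.8 is the classic red flag: the variable may
# predict defaults and still carry a disparate impact.
```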

Editor at Large Penny Crosman welcomes feedback at penny.crosman@sourcemedia.com.
