Is it OK for lending algorithms to favor Ivy League schools?

Much of the energy behind the fintech movement stems from the promise that it can provide financial services to those left out of the mainstream banking sector.

But the way some online lenders go about deciding to whom they will lend flies in the face of this premise. Their underwriting relies heavily on where the potential borrower attended college, and they tend to skew in favor of the Ivy League. Their algorithms predict the future income potential of graduates based on their alma mater — those who hail from Harvard are likely to be far richer in the future than those from, say, certain community or vocational schools.
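To make the approach concrete, here is a simplified sketch in Python of how such an income-by-alma-mater signal might feed an underwriting score. It is not any lender's actual model; the school tiers and dollar figures are invented for illustration.

```python
# Hypothetical illustration only -- not any lender's actual model.
# Maps invented school tiers to assumed future-income figures.
EXPECTED_INCOME_BY_SCHOOL = {
    "ivy_league": 110_000,
    "state_university": 65_000,
    "community_college": 45_000,
}

def projected_income_score(school: str, default_income: int = 40_000) -> float:
    """Scale an applicant's assumed future income into a 0-1 underwriting signal."""
    income = EXPECTED_INCOME_BY_SCHOOL.get(school, default_income)
    return min(income / 120_000, 1.0)

print(projected_income_score("ivy_league"))        # ~0.92
print(projected_income_score("community_college")) # ~0.38
```

The point of the sketch is how directly the school label drives the score, which is exactly what raises the disparate-impact question.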

While disparate-impact studies have yet to be conducted on this practice, how can it not have a disparate impact? The school someone attends depends, to a large degree, on the socioeconomic stratum they were born into, how much money their parents make and what neighborhood they live in. None of these factors is supposed to be considered in a lending decision.

Comptroller of the Currency Thomas Curry made this point at the LendIt conference this week. He didn’t speak directly of using education as a lending criterion, but he did stress that all fintech companies, especially those that seek a special-purpose charter from the OCC, must provide financial inclusion and not discriminate.

“Last year, we started an international conversation by offering one definition for responsible innovation,” Curry said. “Implicit in the definition are rigorous controls and governance to ensure you comply with applicable laws and regulations, provide fair access to your services, and treat your customers fairly.”

Fintech companies that apply for a national charter must include in their business plans a description of “how they will support the needs of the communities they serve and promote financial inclusion,” Curry said.

Even those fintech companies that don’t apply for a charter and are not subject to OCC supervision are bound to adhere to the same principles that govern the industry, such as the need not to have lending policies that “might exclude a high number of applicants who have lower income levels or lower home values than the rest of the applicant pool,” the OCC’s guidance on disparate impact says.

The Consumer Financial Protection Bureau is also looking into this issue. It recently issued a request for information on the benefits and risks of using alternative information in credit decisions.

“If the use and analysis of alternative data leads to certain consumers being needlessly penalized, we want to know that,” Richard Cordray, the bureau's director, said in a speech in February.

He cited newer underwriting algorithms that use measures of residential stability to predict creditworthiness, a practice that could hinder access to credit for members of the military, who move frequently.

“Other data may be strongly correlated with characteristics such as race or gender, which could enable lenders to do indirectly what they are forbidden from doing directly: drawing conclusions about whether to make a loan based on a person’s race, gender or other prohibited categories,” Cordray said. “Similarly, data tied to a consumer’s place on the economic ladder may hinder those trying to climb it.”
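Cordray's proxy concern lends itself to a simple test. The sketch below, built on made-up applicant data, shows one check a fair-lending analyst might run: measure how strongly a facially neutral feature, such as college selectivity, correlates with a protected attribute collected only for testing purposes.

```python
import pandas as pd

# Made-up applicant data for illustration. The protected attribute would
# be collected only for fair-lending testing, never for underwriting.
applicants = pd.DataFrame({
    "college_tier": [1, 1, 2, 3, 3, 2, 1, 3],  # 1 = most selective
    "minority":     [0, 0, 0, 1, 1, 1, 0, 1],  # protected attribute
})

# A feature that never names race can still stand in for it. A first-pass
# check is the correlation between the feature and the protected attribute.
proxy_corr = applicants["college_tier"].corr(applicants["minority"])
print(f"Correlation with protected attribute: {proxy_corr:.2f}")
```

A high correlation would not prove illegal discrimination by itself, but it would flag the feature for closer review.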

The Ivy is green

Regulation aside, even from a pure risk angle, evidence and opinion are building that the value of a college degree as a proof of character may be overrated.

Laszlo Bock, who led hiring at Google as its senior vice president of people operations, has said that Google values the skills and experience candidates gain in college, but that a degree itself doesn't reveal much about talent or grit.

“When you look at people who don’t go to school and make their way in the world, those are exceptional human beings. And we should do everything we can to find those people,” Bock told The New York Times.

Google, Apple and IBM are among the companies that no longer require a college degree for many of their jobs.

And people who apprenticed as plumbers and started their own businesses, or who studied something practical at a community college, could turn out to be excellent credit risks in the long run.

Are online lenders leaving good credits on the table because of their credentialism?

At the LendIt conference, several online lenders defended algorithm-based lending, even when it includes college as a factor. They argued that although machine-based underwriting is still being perfected, it is far less biased than people.

“I think a baseline question is, how much disparate impact already exists in the system?” said Paul Gu, co-founder of the online consumer lender Upstart, which includes the potential borrower’s college in its underwriting criteria. “I think we would be kidding ourselves if we thought that the traditional way of underwriting was a completely unbiased way of underwriting. If you look at credit scores by any demographic, they're extremely uneven. If you look at credit access in America, it's extremely uneven.”

Newer algorithmic lending models do carry a risk of excluding applicants from less-credentialed colleges, Gu acknowledged. But better, more accurate models will make the process less biased over time, he said.

Modern credit models are moving closer to identifying borrowers' true level of risk, according to Gu.

“As long as you know everything there is to know about a person, by using better models, you're going to get to a more fair world,” Gu said.

Machine learning systems use different data points than have been used historically, said Kathryn Petralia, co-founder of Kabbage, an online lender to small businesses. She added that the data science is still developing.

“Every business lender wants to understand how long has the business been in business, they want to understand revenue performance over time and consistency over time, those things are important,” she said. “But there's a lot of other data that you can collect from, whether it's elements within a checking account or an accounting platform or their invoice management system. There are no benchmarks for that and it hasn't been used before in the way we're doing it and certainly not in real time. We don't have any underwriters, all the decisions are automated, so you have to be able to learn over time what's important and what's not.”
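Petralia's point about learning "what's important" over time is, in machine-learning terms, a question of feature importance. Here is a generic sketch, not Kabbage's system, that fits a model on synthetic small-business signals and inspects which ones carry weight; all names and numbers are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic data standing in for small-business signals.
rng = np.random.default_rng(0)
n = 500
months_in_business = rng.normal(60, 20, n)
monthly_revenue = rng.normal(40, 15, n)        # in $000s
revenue_volatility = rng.normal(0.10, 0.05, n)
X = np.column_stack([months_in_business, monthly_revenue, revenue_volatility])

# Invented repayment outcome, loosely driven by revenue and volatility.
y = (0.03 * monthly_revenue - 5 * revenue_volatility + rng.normal(0, 1, n)) > 1.0

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
for name, importance in zip(
        ["months_in_business", "monthly_revenue", "revenue_volatility"],
        model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

In a production system the model would be retrained as repayment outcomes come in, which is what "learning over time" amounts to.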

Human loan officers, on the other hand, are “fundamentally biased,” Petralia said. “Maybe your mom's name is Rachel, so you're like, ‘Oh a Rachel is applying, I like that name, let's give her an extra buck.’ Or, ‘The chick who just broke up with me, her name is Sara and I don't like anybody named Sara.’ It's impossible to eliminate these biases.”

But couldn’t machine learning engines find correlations on their own that could end up excluding portions of the population?

Petralia acknowledged that unintended factors could creep into algorithmic lending.

“You could have an outcome you weren't thinking about, weren't anticipating, like perhaps you make fewer loans to minorities in rural areas for some reason — maybe they don't apply at the same rate, or maybe for some reason the types of businesses they operate don't perform in the same way that urban businesses perform,” she said. “There are so many things you have to take into account to figure out, is it really a bias and what are you trying to achieve with that product you're offering?”

Petralia said that’s why it is important to continually review outcomes, including the customer mix and the effect on various groups.
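One common way to operationalize that review is an adverse-impact ratio: compare each group's approval rate against the highest-approving group and flag large gaps. The sketch below borrows the four-fifths threshold from U.S. employment law as a rule of thumb; the groups and counts are invented.

```python
# Invented application and approval counts, by group.
outcomes = {
    "group_a": (1000, 620),  # (applications, approvals)
    "group_b": (400, 180),
}

rates = {g: approved / applied for g, (applied, approved) in outcomes.items()}
benchmark = max(rates.values())

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: approval {rate:.1%}, ratio {ratio:.2f} [{flag}]")
```

Lenders that run such checks regularly can catch an unintended skew, like the rural-minority example Petralia describes, before it compounds.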

By factoring the college an applicant attended into underwriting decisions, some lenders might be shunning worthy borrowers, perhaps unintentionally. Still, lending models that rely on AI and machine learning have the potential to open up credit to consumers and businesses that historically have been underserved.

“If anything, this is a really good opportunity,” said Prashant Fuloria, chief product officer at Fundbox. “For example, how would you underwrite a recent migrant who doesn't have five years’ worth of credit history? There are startups out there who can solve this problem.”

So while some might use their technology to get away with elitism, others are using similar technology as an equalizer.

Editor at Large Penny Crosman welcomes feedback at penny.crosman@sourcemedia.com.
