How to Talk About Big Data and Lending Discrimination


Facebook recently patented technology that could help lenders measure borrowers' creditworthiness based on their social network connections. The idea is that borrowers will be more likely to pay back their loans if the people they are friends with also pay back their loans.

On one hand, this sounds like a reasonable way to evaluate creditworthiness. Richer people tend to have richer friends, who are more likely to have higher credit scores. If lenders are trying to figure out whether to extend loans to people with thin credit files, the knowledge that a person has a lot of financially stable friends may make him or her a safer bet.

On the other hand, this seems like an unfair way to distribute loans. Most of us are Facebook friends with a bunch of people from high school. If I went to high school in a poor community but now have the means to pay back my loans, this method could wrongly rule me out. (Of course, that's assuming social media ties are the determining factor in a lending decision. Lenders also typically look at information about borrowers' incomes, debt, and collateral.)
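To make the friend-based scoring idea concrete, here is a minimal sketch. The data and scoring rule are hypothetical, not Facebook's actual patented method; it simply averages the repayment rates of a borrower's social connections, which is enough to show how the borrower's own finances never enter the calculation.

```python
def friend_based_score(friend_repayment_rates):
    """Toy creditworthiness proxy: the average repayment rate among a
    borrower's friends. Returns None when there is no friend data.

    Hypothetical illustration only -- note that the borrower's own income,
    debt, and collateral play no role in this number.
    """
    if not friend_repayment_rates:
        return None
    return sum(friend_repayment_rates) / len(friend_repayment_rates)

# A borrower whose high-school friends struggled to repay looks risky
# under this rule, even if the borrower personally can repay.
score = friend_based_score([0.5, 0.25, 0.75, 0.5])
print(score)  # 0.5
```

Under this toy rule, two borrowers with identical finances can get different scores purely because of where they grew up.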

The possibility that lenders could use Facebook to evaluate borrowers also raises concerns about disparate impact, a concept that was beautifully explained in the recent article, "When Big Data Becomes Bad Data." The idea is that when a lender's process favors one group of people over another, it may be illegal — whether or not the discrimination is intentional. There's lots of precedent for this in the courts, and the Supreme Court recently upheld disparate impact as a legitimate argument in Fair Housing Act cases.
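One common way to put a number on disparate impact is the "four-fifths rule," a heuristic from U.S. employment-selection guidelines that analysts sometimes borrow for lending data: if one group's approval rate falls below 80 percent of another group's, the process is flagged for possible disparate impact, whatever the lender intended. The figures below are made up for illustration.

```python
def approval_rate(approved, applicants):
    """Fraction of applicants who were approved."""
    return approved / applicants

def fails_four_fifths(protected_rate, reference_rate):
    """Flag possible disparate impact: True when the protected group's
    approval rate is below 80% of the reference group's rate.
    (A screening heuristic, not a legal determination.)"""
    return protected_rate < 0.8 * reference_rate

group_a = approval_rate(60, 100)  # reference group: 60% approved
group_b = approval_rate(40, 100)  # protected group: 40% approved
print(fails_four_fifths(group_b, group_a))  # True: 0.40 < 0.8 * 0.60
```

The point of the heuristic is that intent never appears in the formula; only the outcomes for each group do.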

It's still not clear whether a "disparate impact" argument can be used in the case of algorithms, however. And there are plenty of people who work in the field of big data who dismiss concerns about disparate impact altogether and claim the Facebook idea is entirely legitimate. I had an argument on the Slate Money podcast last Friday about this very question.

Here's my theory as to why the issue of big data and disparate impact is so controversial. According to neoliberal ideology, every person is expected to behave rationally, act as an individual and seek maximum profit. The invisible hand is meant to guide each of us on a miniature scale.

If people buy into the ideology that instructs us to act as individuals and ignore group dynamics, the disparate impact argument is difficult — if not impossible — to understand. Why would anyone want to loan money to a poor person who may have trouble paying it back? That makes no economic sense. More relevantly, why would anyone opt not to distinguish between a poor person and a rich person before making a loan or any other kind of financial decision? That's the heart of how the big data movement operates. To change this mindset would be to throw away money.

If we regard every financial interaction in terms of game theory and strategies for winning, "fairness" doesn't come into the equation. (Note: the more equations, the better!)

So here's a proposal as to how we can have more thoughtful conversations about big data and fair lending issues. All parties need to distinguish between the goals of the lender and the goals of the general public. The lender's main goals are accurate data and profit. The public's goal is to have a financial system that does not exacerbate current inequalities or send people into debt spirals. The former goal is individualistic at its core; the latter goal pertains to broad groups of people.

With this framework in mind, we can go ahead and discuss the pros and cons of the newest big data ideas and how they may inform the future of financial services. But we must remember to look at each idea through both lenses in order to have reasonable and productive conversations.

Cathy O'Neil is a mathematician who blogs about quantitative issues at mathbabe. An earlier version of this post appeared on her blog. Follow her on Twitter @mathbabedotorg.
