Fintechs seek CFPB guidance on making AI-based lending fair


Six fintech companies and the National Community Reinvestment Coalition are asking the Consumer Financial Protection Bureau to provide guidance on how to use artificial intelligence in lending decisions without running afoul of fair-lending laws.

In a letter sent Tuesday to the CFPB, the six companies — Affirm, LendingClub, Oportun, PayPal Holdings, Square and Varo Bank — publicly commit to uphold the “disparate impact” legal doctrine that has been widely unpopular with financial institutions. The legal theory of disparate impact, which is a component of the Fair Housing Act and other fair-lending laws, maintains that a policy is illegal if it has a discriminatory effect on a protected class, but it may not be illegal if there is a business purpose that serves a “substantial, legitimate, nondiscriminatory interest.”


The letter marks the first time that a group of lenders has sought common ground with consumer advocates on how to make the disparate impact legal theory apply effectively to lending models that use artificial intelligence, machine learning algorithms and alternative data.

“Disparate impact addresses discrimination that can arise when decisions are the result of algorithms or data, rather than human intent,” the letter states. “We appreciate disparate impact’s statistical, outcomes-based approach to identifying discrimination. We also believe this outcomes-based approach establishes disparate impact as a pro-innovation framework for preventing discrimination.”

The fintechs' request for guidance comes just days after the Department of Housing and Urban Development rescinded a Trump administration rule that had weakened disparate impact.

The letter is a response to a request for information issued in March by five federal regulators on the use of artificial intelligence and machine learning, which could be a first step toward an interagency policy. The comment period for that public request for information was extended to July 1.

The letter also comes as the CFPB is determining how to update the Equal Credit Opportunity Act, which prohibits discrimination in credit and lending decisions. The bureau issued a request for information a year ago on whether the 1974 law and its implementing regulation, Regulation B, should be overhauled in light of two separate Supreme Court decisions on gender identity discrimination and disparate impact.

The Equal Credit Opportunity Act covers intentional discrimination, and the CFPB has asserted that unintentional discrimination, primarily under the disparate impact theory, is also covered by the law. Many experts dispute that view. The Supreme Court has held that the Fair Housing Act includes a disparate impact standard.

The letter asks the CFPB to state specifically that disparate impact applies to models that use artificial intelligence. The companies also seek clarity on when disparities in loan decisions rise to the level of potential discrimination. The six fintechs are on the NCRC’s Innovation Council for Financial Inclusion. The NCRC is a national network of fair lending, fair housing and consumer rights advocates.

“We want to lead the financial services industry in the direction of an effective use of fair lending analysis and an effective disparate impact framework to get this right,” Brad Blower, the NCRC’s general counsel, said in an interview Tuesday.

The fintechs and the NCRC also want the CFPB to create clear guidelines for when and how lenders should seek alternative models or data. They also ask the bureau to explain the extent to which lenders can collect demographic data for their own fair lending compliance reviews.

Several of the fintech CEOs publicly affirmed their commitment to disparate impact regulations as an essential safeguard against racial bias in financial services.

“We know that a history of racism in financial services contributed to much of the economic inequality we see today,” Colin Walsh, Varo Bank's CEO, said in a press release.

Others said that disparate impact rules are critical for both society and industry.

“Strong regulatory protections against discrimination in lending are critical for a just society, and for technology innovation to succeed,” said Armen Meyer, LendingClub’s head of public policy. “Fintech has united in support of strengthening these regulations that address the risk of digital redlining.”

The fintech lenders said that CFPB guidance on how anti-discrimination laws apply to digital lending would also give investors confidence to invest in the technology.

“The CFPB can protect consumers from digital discrimination by speaking directly to the public on how it will enforce fair lending practices in an increasingly digital-first marketplace,” Jesse Van Tol, the NCRC’s chief executive, said in a press release. “We cannot stand by and allow algorithms to resurrect old biases in new packages, or introduce new forms of discrimination hidden in proprietary code.”

The commonplace use of artificial intelligence and machine learning systems makes it “critical that [the CFPB] updates its regulations to hold lenders accountable to building underwriting systems that prioritize the rights of consumers, and particularly for historically disadvantaged and underserved groups,” Van Tol said. “We cannot allow complexity to be either an opportunity or an excuse for digital discrimination.”

The fintechs also may be jockeying to stake out their positions before Rohit Chopra, the Biden administration's nominee to lead the CFPB, takes the helm of the agency upon Senate confirmation. Chopra, a commissioner at the Federal Trade Commission, has written extensively about technology companies, and many experts say he will want to look under the hood of the black-box algorithms being used by many lenders that promote "inclusivity."
