Regulators warned at an industry conference this week that lenders remain on the hook for racial bias even as mortgage lending becomes more automated.
"Credit scoring is not a blanket of immunity," said Peter M. Kaplan, an associate in the Department of Housing and Urban Development. Lenders are still liable if minorities or other groups are unfairly denied credit, Mr. Kaplan said. He spoke at an annual fair-lending conference of the Mortgage Bankers Association.
In the search for potential racial bias, regulators want to be sure that the statistics-based scoring systems do not codify racially biased lending practices.
Even lenders that use credit-scoring systems developed by others are liable if the systems produce racially biased results, Mr. Kaplan said.
He advised lenders to ensure that the pool of information used to create the scoring systems includes both credit applications that were accepted and those that were denied. If only creditworthy applicants are included, Mr. Kaplan said, the system may not work for the whole population, especially since minorities are more likely to have been denied credit.
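A toy illustration of the point, with invented names and figures: a model built only from approved applications never sees the characteristics of denied applicants, so statistics fit to that pool understate the risk profile of the full applicant population.

```python
# Hypothetical data: each record is (debt_ratio, was_approved, defaulted).
# Denied applicants carry no repayment outcome, which is exactly why
# leaving them out of the development pool skews the model.
applications = [
    (0.10, True, False),
    (0.20, True, False),
    (0.30, True, False),
    (0.40, True, True),
    (0.55, False, None),   # denied: outcome unobserved
    (0.60, False, None),
    (0.70, False, None),
]

approved_only = [a for a in applications if a[1]]
full_pool = applications

# Average debt ratio each pool would teach the model.
mean_ratio_approved = sum(a[0] for a in approved_only) / len(approved_only)
mean_ratio_full = sum(a[0] for a in full_pool) / len(full_pool)

print(mean_ratio_approved)  # about 0.25
print(mean_ratio_full)      # about 0.41 -- the fuller, riskier picture
```

The gap between the two averages is the distortion Mr. Kaplan warns about: a system calibrated on the narrower pool can misjudge applicants who resemble those previously denied.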
In setting a cutoff score, below which applicants are not automatically granted credit but are referred to an underwriter, Mr. Kaplan suggested lenders find "an appropriate balance" between their need to screen out additional defaults and the effect of a particular cutoff score on minorities and other protected classes.
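One way to sketch that balance, using entirely invented scores and group labels: for a candidate cutoff, compute how often each group falls below it and is referred rather than automatically approved.

```python
# Hypothetical applicant pool; "A" and "B" stand in for any two
# groups a lender must compare when testing a cutoff's effect.
applicants = [
    {"group": "A", "score": 720}, {"group": "A", "score": 680},
    {"group": "A", "score": 640}, {"group": "B", "score": 700},
    {"group": "B", "score": 630}, {"group": "B", "score": 610},
]

def referral_rate(pool, cutoff):
    """Share of a pool scoring below the cutoff (referred, not auto-approved)."""
    below = sum(1 for a in pool if a["score"] < cutoff)
    return below / len(pool)

cutoff = 650
group_a = [a for a in applicants if a["group"] == "A"]
group_b = [a for a in applicants if a["group"] == "B"]

print(referral_rate(group_a, cutoff))  # 1 of 3 referred
print(referral_rate(group_b, cutoff))  # 2 of 3 referred
```

Comparing these rates at several candidate cutoffs is the kind of analysis that would show where default-screening gains come at a disproportionate cost to a protected class.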
He stressed that scoring systems must be updated regularly with new information on borrower performance. "Even if it's properly designed when you start using it, it's critical that you monitor the impact of the system and feed back what the performance is" into the system, Mr. Kaplan said.
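A minimal sketch of that monitoring loop, with invented score bands and tolerance: as repayment performance comes in, compare it against the default rate each band assumed, and flag bands that have drifted.

```python
# Hypothetical calibration: the default rate the model assumed
# for each score band when it was built.
expected_default = {"600-649": 0.08, "650-699": 0.04, "700+": 0.01}

def bands_needing_recalibration(observed, expected, tolerance=0.02):
    """Return score bands whose observed default rate drifted past tolerance."""
    return sorted(
        band for band, rate in observed.items()
        if abs(rate - expected[band]) > tolerance
    )

# Performance actually observed after loans seasoned (invented figures).
observed_default = {"600-649": 0.15, "650-699": 0.05, "700+": 0.01}

print(bands_needing_recalibration(observed_default, expected_default))
# the 600-649 band has drifted and needs to be fed back into the model
```

Feeding the flagged bands' actual performance back into the system is the updating Mr. Kaplan describes as critical.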
Sharon McHale, a Freddie Mac spokeswoman, said her company was confident that its scoring models, widely used by mortgage lenders, treat borrowers of all races and income levels fairly.
Ms. McHale said Freddie Mac's scoring models are based on racially diverse pools of mortgage borrowers. Moreover, mortgages to minorities approved by the models default as frequently as those made to other groups. That suggests that the models are not tougher on minorities, Ms. McHale said.
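The comparison Ms. McHale cites can be sketched as an outcome test, with all figures invented: among loans a model approved, similar default rates across groups suggest the model is not holding one group to a stricter standard.

```python
# Hypothetical book of approved loans with observed outcomes.
approved_loans = [
    {"group": "minority", "defaulted": True},
    {"group": "minority", "defaulted": False},
    {"group": "minority", "defaulted": False},
    {"group": "other", "defaulted": True},
    {"group": "other", "defaulted": False},
    {"group": "other", "defaulted": False},
]

def default_rate(loans, group):
    """Observed default rate among approved loans to one group."""
    pool = [loan for loan in loans if loan["group"] == group]
    return sum(loan["defaulted"] for loan in pool) / len(pool)

# Roughly equal rates are the evidence Freddie Mac points to: if the
# model were tougher on minorities, their approved loans would tend
# to default less often, not equally often.
print(default_rate(approved_loans, "minority"))
print(default_rate(approved_loans, "other"))
```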
Regulators also want to be sure that bias does not creep in when underwriters are given leeway to supplement the automated decision-making process.
"With mortgage applicants, there's often a lot of interaction between the applicant and mortgage officer," said Joan Magagna, acting chief of the civil rights division at the Department of Justice.
Ms. Magagna said it was important to be sure that "arbitrary" decisions are not made during those interactions.
She gave an example of a pattern the Justice Department would find troubling. Lenders may override both low and high credit scores if they believe a score does not reflect a borrower's true risks or strengths. But such discretion must be "applied evenly and across the board," Ms. Magagna warned. "We'd be concerned if more whites than blacks were getting loans despite the low scores," she said.
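The check Ms. Magagna describes can be sketched as a simple parity comparison on override records; the data below is invented for illustration.

```python
# Hypothetical override log: applicants who scored below the cutoff,
# and whether an underwriter approved them anyway.
overrides = [
    {"race": "white", "approved_despite_low_score": True},
    {"race": "white", "approved_despite_low_score": True},
    {"race": "white", "approved_despite_low_score": False},
    {"race": "black", "approved_despite_low_score": False},
    {"race": "black", "approved_despite_low_score": False},
    {"race": "black", "approved_despite_low_score": True},
]

def override_rate(records, race):
    """Share of low-scoring applicants of one race approved via override."""
    pool = [r for r in records if r["race"] == race]
    return sum(r["approved_despite_low_score"] for r in pool) / len(pool)

# A wide gap between these two rates is the disparity the
# Justice Department says it would scrutinize.
print(override_rate(overrides, "white"))  # 2 of 3 overridden upward
print(override_rate(overrides, "black"))  # 1 of 3 overridden upward
```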
She said tiered, or risk-based, pricing must be applied consistently across all groups, and lenders must be able to show that borrowers who are charged higher rates actually represent greater risk and additional costs to the lender.