The mortgage industry is undergoing unprecedented transformation as a result of heightened regulation across most parts of the business, owing in large measure to a self-inflicted wound sustained during the housing boom.
Signs of how far the industry strayed from prudent underwriting include the Consumer Financial Protection Bureau's efforts to set the standard for underwriting quality through its qualified mortgage rules; the recent notice of proposed rulemaking on appraisal requirements for riskier mortgages; and the aggressive putback campaign waged by Fannie Mae and Freddie Mac.
The roots of the crisis stretch back to 1996 with the introduction of Freddie Mac's Loan Prospector automated underwriting system, followed shortly by Fannie Mae's Desktop Underwriter. (Disclosure: I worked for Freddie Mac during this period and for Fannie Mae a little earlier.) This new technology in the hands of both government-sponsored enterprises ushered in a remarkable evolution of the traditional mortgage business, which up to that point had relied on humans to underwrite the borrower, verify her income and assets and fully appraise the property. Innovation in the form of statistical modeling enabled the GSEs to construct an algorithm that could distill the three C's of underwriting (creditworthiness, collateral and capacity) into a single score reflecting the borrower's probability of default, allowing an originator to communicate an almost instant loan decision to the applicant. Such tools were designed in part to control credit risk upfront and also to provide significant efficiency in the mortgage process. As more AUS scores passed the GSE-designated threshold of acceptable quality, fewer loans had to be reviewed by the more time-consuming human underwriting process.
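The mechanics described above can be sketched in miniature. The toy model below is purely illustrative: the features, weights and approval threshold are all invented for demonstration, since actual systems like Loan Prospector and Desktop Underwriter are proprietary and far more sophisticated.

```python
import math

def toy_aus_score(fico, ltv, dti):
    """Distill the three C's -- creditworthiness (FICO score),
    collateral (loan-to-value ratio) and capacity (debt-to-income
    ratio) -- into one estimated probability of default.
    All coefficients here are hypothetical."""
    # Higher FICO lowers estimated risk; higher LTV and DTI raise it.
    z = -4.0 + 0.04 * (680 - fico) + 3.0 * (ltv - 0.80) + 2.5 * (dti - 0.36)
    return 1.0 / (1.0 + math.exp(-z))

def decision(p_default, threshold=0.05):
    """Loans scoring under the acceptable-quality threshold skip
    the slower human underwriting step; the rest are referred."""
    return "accept" if p_default < threshold else "refer to human underwriter"
```

A strong applicant (high FICO, low LTV and DTI) scores well under the cutoff and gets a near-instant accept, while a weaker file is routed to a human underwriter; loosening the threshold, as discussed below, lets more loans through the automated path.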
It may seem surprising now, but a major consumer issue back then was the time and effort it took a borrower to secure a mortgage. Paperwork requirements were onerous and costly in addition to the delay in processing all of this information. The game was now afoot.
AUS systems could effectively order risk along a continuum, and they had advantages in the area of fair lending by providing an objective and consistent means of evaluating borrowers. Over time, as the industry became increasingly comfortable with the results of these tools, companies kept tinkering with the acceptable risk profile. More loans were deemed acceptable, which unsurprisingly also meant more loans could be processed more quickly. Even the Federal Housing Administration got into the AUS business and continues to use this technology extensively today.
Remember that in a business as highly commoditized as the conventional conforming mortgage market, there was little product differentiation. Thus originators increasingly looked to compete on service.
Next came appraisals, where the ability to apply statistical modeling to property valuation led to automated valuation models. The combination of AUS and AVMs eventually led to a tiered application of appraisal alternatives. Full appraisals were reserved for the riskier-scoring loans while higher-quality AUS loans could receive streamlined appraisals of various types.
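The tiered routing described above can be summarized in a few lines. The tier names and cutoffs below are hypothetical, not any GSE's actual rules; the point is only the shape of the logic, where better AUS scores earn lighter collateral review.

```python
def appraisal_requirement(p_default):
    """Route a loan to an appraisal type based on its AUS-estimated
    default probability (lower = less risky). Cutoffs are invented
    for illustration."""
    if p_default < 0.02:
        return "AVM only"                         # automated valuation model
    elif p_default < 0.05:
        return "exterior-only (drive-by) appraisal"
    else:
        return "full interior appraisal"
```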
With credit and collateral streamlining addressed, the last bastion of the underwriting process, capacity to repay the obligation, came under scrutiny. With sufficient credit and collateral (referred to as "compensating factors"), a borrower could enjoy further streamlined processing by undergoing a more limited assessment of their income, assets and employment status. These programs started off innocently enough, with low documentation requirements reserved for refinancings where the lender had previous experience with the borrower's payment pattern.
Over time, the application of these new underwriting techniques had two effects on the market that presaged its collapse. First, and most importantly, they allowed the industry to distance itself from the basic blocking and tackling of mortgage underwriting. Processes and controls that had relied on humans and worked for decades gave way to statistical models that are only as good as the last several years of data on which they are built.