BankThink

Something’s missing from the debate over CECL

There’s a lot being written right now about the Current Expected Credit Loss accounting standard, or CECL — but scant attention is going to a critical challenge facing many community banks in implementing a CECL solution.

The problem comes from the focus — from regulators, accountants and consultants alike — on the voluminous data required for CECL compliance and on what to do about it all. Yet what many community banks face is actually a dearth of historical loss data.

It’s fine to lay out a laundry list of required data fields, as CECL consultants are doing, but if loss experience is limited and concentrated, then there are going to be a lot of empty fields. And banks facing this predicament will not be able to do the kind of analytics being talked about (vintage analysis, migration analysis, probability of default/loss given default analysis and others) in any meaningful way. Software vendors are peddling CECL model solutions, but those models aren’t going to be much help without the required bank-specific data to run through them.

Further, for smaller, newer institutions, the problem is compounded by the absence of sufficiently granular peer loan-loss databases to draw on — not even to purchase. The handful of databases that are available out there are populated by and pitched at the very large banks.

And so the small, data-constrained banks (those that have managed their portfolios too well to have losses to work with, or that simply haven’t been around long enough) have neither internal nor external loan-level data to mine with any reliability. To project expected losses over the life of a new loan and book the appropriate loan loss provision at origination, as CECL will require, a bank needs to understand how loss potential is shaped by key characteristics of both the borrower and the loan itself (the industry, key financial ratios, the assigned risk grade, the collateral structure) and how those characteristics interact over time. Ideally, expected loss is calibrated against the bank’s own experience with similar borrowers and loans; failing that, against external (i.e., peer) loss experience.
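To make the mechanics concrete, here is a stylized, illustration-only sketch of the kind of life-of-loan expected-loss calculation CECL contemplates. Every figure, parameter and function name below is an invented assumption for exposition; nothing here is prescribed by the standard or drawn from any particular bank’s methodology.

```python
def lifetime_expected_loss(balance, annual_pd, lgd, years, amort_rate=0.0):
    """Sum expected loss over each remaining year of a loan's life.

    balance    -- current exposure (an assumed exposure-at-default)
    annual_pd  -- assumed annual probability of default
    lgd        -- assumed loss given default (fraction of balance lost)
    years      -- assumed remaining life of the loan, in years
    amort_rate -- assumed fraction of balance amortized each year
    """
    survival = 1.0   # probability the loan has not yet defaulted
    total_el = 0.0
    for _ in range(years):
        total_el += survival * annual_pd * lgd * balance
        survival *= (1.0 - annual_pd)   # only surviving loans can default later
        balance *= (1.0 - amort_rate)   # exposure declines as the loan amortizes
    return total_el

# Hypothetical example: a $1,000,000 loan, 1% annual PD, 40% LGD, 5-year life.
reserve = lifetime_expected_loss(1_000_000, 0.01, 0.40, 5)
```

The point of the sketch is what it requires: a bank cannot supply defensible values for `annual_pd` or `lgd` by segment without exactly the historical, loan-level loss data these institutions lack.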

Migration analysis, which looks at migration patterns down through the risk grades toward (even if not all the way to) default, can be some help. But only some. Without enough loans having gone to their grave, it won’t be possible to fully understand the drivers of loss dynamics: the key borrower and loan characteristics indicated above that help a bank quantify the longer-term, life-of-loan expected loss on a newly originated loan. Default and charge-off are the endgame, not some broader indicator of portfolio weakness. Many loans will show signs of weakness during a recession, for instance, but most will have the capacity to withstand some stress and continue to repay.
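For readers unfamiliar with the technique, migration analysis can be sketched as repeated application of a one-period grade-transition matrix. The grades and every transition probability below are invented for illustration; a real matrix would have to be estimated from the very historical grade and charge-off data that is in short supply.

```python
# Grades: Pass, Watch, Substandard, Default (an absorbing state).
# Each row gives assumed one-period transition probabilities from that grade.
T = [
    [0.92, 0.06, 0.015, 0.005],   # Pass
    [0.10, 0.75, 0.120, 0.030],   # Watch
    [0.00, 0.08, 0.770, 0.150],   # Substandard
    [0.00, 0.00, 0.000, 1.000],   # Default (defaulted loans stay defaulted)
]

def step(dist, T):
    """Apply one period of grade migration to a grade distribution."""
    n = len(T)
    return [sum(dist[i] * T[i][j] for i in range(n)) for j in range(n)]

# Probability that a loan starting at Pass defaults within 5 periods:
dist = [1.0, 0.0, 0.0, 0.0]
for _ in range(5):
    dist = step(dist, T)
cum_default = dist[-1]   # cumulative default probability under these assumptions
```

Even this simple construction illustrates the article’s point: without enough observed defaults, the bottom rows of the matrix cannot be estimated with any reliability, and the whole exercise rests on guesswork.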

The resulting reality for these institutions is that CECL will likely translate into only incremental enhancements to the current methodology, with its typical reliance on Federal Deposit Insurance Corp. average loss rates by loan type, as now used to augment segments or portfolios with limited internal loss data. That data, whether accessed through Uniform Bank Performance Reports, Statistics on Depository Institutions or other tools on the FDIC’s website, is a terrific resource that has helped many banks comply with requirements under the existing (incurred loss) methodology. But it does not provide the range of variables required, at the necessary level of disaggregation (i.e., loan by loan), to meet the more exacting demands of CECL.

In practical terms, CECL compliance for such institutions will translate into two key sets of activities. First, it will require extending the look-back period beyond the typical three to five years for deriving historical loss rates, or HLRs, consistent with CECL’s “life of loan” horizon. Second, it will mean using the qualitative and environmental factors (“Q-factors”) to extend the now-standard 12-month loss recognition period out to the remaining life of the loan for purposes of estimating the potential loss rate at a new loan’s inception. These banks will probably still use the same loan segments for their general reserving, and the same basic HLR and Q-factor constructions, even if the data (expanded to include macro forecasts) is sourced and entered a little differently. By default, the FDIC will continue to be the primary source of historical data for these banks.
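The two activities just described reduce, arithmetically, to something like the sketch below: an annual HLR stretched over the segment’s remaining life, plus a Q-factor adjustment. All figures, including the segment, rates and adjustment, are hypothetical assumptions for illustration only, not a template any examiner has endorsed.

```python
def cecl_segment_reserve(segment_balance, annual_hlr, remaining_life_years,
                         q_factor_adj):
    """Stylized HLR-plus-Q-factor reserve estimate for one loan segment.

    annual_hlr           -- assumed average annual charge-off rate from the
                            extended look-back period
    remaining_life_years -- assumed weighted-average remaining life of the segment
    q_factor_adj         -- assumed additive rate adjustment for qualitative
                            and environmental factors (e.g., a macro forecast)
    """
    life_of_loan_rate = annual_hlr * remaining_life_years + q_factor_adj
    return segment_balance * life_of_loan_rate

# Hypothetical example: a $50M segment, 0.3% annual HLR,
# 4-year average remaining life, +20 basis points of Q-factor adjustment.
segment_reserve = cecl_segment_reserve(50_000_000, 0.003, 4, 0.002)
```

Note how much of the outcome hinges on the Q-factor term, which is precisely where judgment substitutes for the loan-level data these banks do not have.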

Of course, putting in place systems and processes to capture the requisite data going forward is important, including detail on any future losses the bank may sustain; data collection started today will yield at least three years of detailed loss history by the time CECL implementation rolls around. But three years is not a full cycle, and capturing lots of detail on future new loans in order to book the required CECL loss provision at origination won’t go very far without the supporting historical data and analysis needed to understand what all that new loan data implies about potential future losses.

Yes, CECL will be a sea change for larger institutions, as it will be for those community banks that have been around for a while and/or that tripped up in the last recession. But for many others, those that find themselves subject to these loss data constraints, the changes will by necessity be far less dramatic and more easily accommodated. And with most banks currently holding reserves well above recent or reasonably anticipated loss levels, supported by liberal application of Q-factors and/or by a cushion of unallocated reserves, reserve adequacy itself is the lesser concern.

What is a concern is what we’re starting to hear voiced in conversations with chief executives and chief credit officers, and which I expect we’ll hear quite a bit more of as CECL dates draw closer: frustration over a misalignment between what’s expected and what’s achievable. Requiring banks to run new models without their having the data to feed such models is not a recipe for productive outcomes. While the goals of CECL are certainly reasonable — loss reserving should reflect statistically expected future losses on a loan, not just losses already incurred — many smaller community banks feel pressured to enact a solution beyond their reach. At this point we’re not yet seeing much recognition of the dilemma this poses, and of the realistic need for a more pragmatic, less data-intensive approach to calculating CECL-based reserve requirements.

More useful guidance and feedback are needed for this segment of the industry, a segment that’s so often the recipient of both legislative and regulatory lip service.
