Basel III, with its requirement that banks risk-weight their assets to calculate their capital levels, will bring about sweeping changes in the way banks assess the credits on their books. Banks may need to rely less on the types of stochastic models they've used in the past, which plot hundreds of possible scenarios to determine a reasonable probability of default.
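The scenario-based approach described above can be sketched in a few lines: simulate many possible terminal values for a borrower's assets and count how often they end below the debt owed. The lognormal dynamics, parameter values and function name below are illustrative assumptions, not any bank's production model.

```python
import random
from math import exp, sqrt

def mc_default_probability(asset_value, debt_face, mu, sigma,
                           horizon=1.0, n_paths=10_000, seed=42):
    """Monte Carlo estimate of a default probability.

    Draws many lognormal terminal asset values (a stylized stand-in
    for the scenario engines described above) and counts the share
    that end below the face value of debt. All inputs are illustrative.
    """
    rng = random.Random(seed)
    defaults = 0
    for _ in range(n_paths):
        shock = rng.gauss(0.0, 1.0)  # one standard normal draw per path
        terminal = asset_value * exp((mu - 0.5 * sigma ** 2) * horizon
                                     + sigma * sqrt(horizon) * shock)
        if terminal < debt_face:
            defaults += 1
    return defaults / n_paths

# Illustrative firm: assets 120, debt 100, 8% drift, 25% asset volatility
p = mc_default_probability(120.0, 100.0, 0.08, 0.25)
```

With enough paths the estimate converges on the analytic answer; the point of simulation in practice is that it also handles features with no closed form.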

"The original expected default frequency model is an application of the Black-Scholes-Merton view of credit risk that's now three decades old," says David Hamilton, managing director of quantitative credit research at Moody's Analytics. The expected default frequency model many banks use has been around for two decades and is a point-in-time model, Hamilton says. "Every day it's recalculating the probability of default based on the most recent data available, which often is equity market information," he says. "That point-in-time view takes into account everything that's relevant for evaluating a firm's probability of default. It includes the kitchen sink — not just the firm's own long-term credit risk, but macroeconomic factors, macrocredit factors, industry effects and geographical effects."
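The Black-Scholes-Merton view Hamilton refers to treats default as the firm's asset value falling below the face value of its debt, so a point-in-time PD can be read straight off a normal distribution. A minimal sketch, with illustrative rather than calibrated inputs:

```python
from math import log, sqrt, erf

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_pd(asset_value, debt_face, mu, sigma, horizon=1.0):
    """Point-in-time probability of default in the Merton model.

    Default occurs if asset value falls below the face value of debt at
    the horizon; distance to default is measured in units of asset
    volatility. Inputs here are illustrative, not calibrated.
    """
    dd = (log(asset_value / debt_face)
          + (mu - 0.5 * sigma ** 2) * horizon) / (sigma * sqrt(horizon))
    return norm_cdf(-dd)

# Same illustrative firm: assets 120, debt 100, 8% drift, 25% volatility
pd_1y = merton_pd(120.0, 100.0, 0.08, 0.25)
```

Because the asset volatility is typically backed out of equity prices, the PD moves with the market every day, which is exactly the point-in-time behavior Hamilton describes.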

But the Basel III accord calls for risk measures that consider cycles rather than a point in time, Hamilton says. "The reason they called for that is that regulators do not want banks to use point-in-time probabilities of default as inputs into their capital models. They'll result in pro-cyclical capital, which means that in a boom period you'll have relatively low capital," he says. "In a downturn, your provision for capital goes up as risk factors go up. That's a problem because in a downturn capital gets more expensive."

Such a through-the-cycle view gives banks a more accurate signal about how their capital reserves should change over time and whether the environment is becoming more or less risky.
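The difference is easy to see numerically: a through-the-cycle PD averages the point-in-time estimates over a full cycle, so capital built on it neither collapses in booms nor spikes in busts. The quarterly PD series below is invented for illustration:

```python
# Hypothetical quarterly point-in-time PDs for one borrower over a
# three-year credit cycle (boom, downturn, recovery) -- invented data
pit_pds = [0.010, 0.008, 0.006, 0.005, 0.007, 0.015,
           0.030, 0.025, 0.018, 0.012, 0.009, 0.007]

# The through-the-cycle estimate smooths the whole series into one number
ttc_pd = sum(pit_pds) / len(pit_pds)
```

Capital charged off the point-in-time series would swing by a factor of six between the best and worst quarters; the through-the-cycle figure sits between the extremes and stays put.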

Another factor that doesn't get talked about often is that the models banks and other companies used to price credit-based instruments such as collateralized debt obligations and collateralized mortgage obligations failed them when actual credit defaults, especially on mortgages, defied all predictions, computer- or human-generated. One crucial problem was the underlying data. Risk managers were feeding housing and mortgage data from the past 75 years into their models, which turned out to be irrelevant because that period contained neither a severe drop in housing prices nor a comparable surge in mortgage defaults.

In addition to modifying credit risk models themselves, banks will need to invest time and technology into validating and back testing their model results, says Ioannis Akkizidis, senior financial risk analyst at Wolters Kluwer Financial Services.

"Regulators have said not all existing models are robust enough, so they propose to do stress testing of the input parameters to validate the results and also some back testing," he says. "They want to do stress tests on the quality and behavior of counterparties, especially the ones that are highly rated. Some of the requirements for validating the models are going to be redefined."

Back testing done with data from the past three years will yield dismal results, since three years ago is when the financial crisis started, Akkizidis points out.
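A basic form of the back testing Akkizidis describes compares a rating bucket's predicted PD with the default count actually realized. The sketch below uses a simple normal approximation to the binomial and ignores default correlation, which real validation frameworks must handle; the function name and the numbers are hypothetical:

```python
from math import sqrt

def backtest_pd(predicted_pd, n_obligors, n_defaults):
    """Simple binomial back test of a rating bucket's predicted PD.

    Returns the normal-approximation z-score of the realized default
    count against the model's prediction; |z| > 1.96 flags the bucket
    at roughly the 5% level. A sketch only -- it assumes independent
    defaults, which understates the variance in a real portfolio.
    """
    expected = n_obligors * predicted_pd
    std = sqrt(n_obligors * predicted_pd * (1.0 - predicted_pd))
    return (n_defaults - expected) / std

# Hypothetical bucket rated at 1% PD: 2,000 obligors, 38 defaults observed
z = backtest_pd(0.01, 2000, 38)
```

Run over crisis-era data, nearly every bucket in a portfolio produces z-scores like this one, well past the flag threshold, which is exactly the dismal result Akkizidis warns about.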

The regulators about to enforce Basel III are emphasizing transparency, Akkizidis says. "It's very important for the regulators and also for the bank and the market to have transparency in the models," he says.

Regional banks may be the ones most interested in using new credit risk models, experts say. "Presumably the largest, most sophisticated banks have tried to tackle this on their own," Hamilton says. "They clearly have the resources to do it. It's the more medium-size banks that don't have the quant resources to work with through-the-cycle data." Under previous iterations of Basel, regional banks had the option to use standardized methods for calculating risk-weighted assets, which is easier than building their own risk models. But the standardized approach could put them at a competitive disadvantage from a capital perspective: banks that use advanced, internal models may be able to earn more favorable risk weightings.
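The capital difference between the two approaches comes from the IRB supervisory formula, which maps a bank's own PD estimate into a risk weight. A sketch of the Basel corporate formula, assuming foundation-IRB inputs of 45% LGD and 2.5-year effective maturity:

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def irb_risk_weight(pd, lgd=0.45, maturity=2.5):
    """Basel IRB risk weight for a corporate exposure.

    Implements the supervisory formula: asset correlation R (decreasing
    in PD), capital requirement K at the 99.9% confidence level, a
    maturity adjustment, and the conversion K * 12.5 to a risk weight.
    """
    # supervisory asset correlation, interpolated between 12% and 24%
    r = (0.12 * (1 - exp(-50 * pd)) / (1 - exp(-50))
         + 0.24 * (1 - (1 - exp(-50 * pd)) / (1 - exp(-50))))
    # maturity adjustment coefficient
    b = (0.11852 - 0.05478 * log(pd)) ** 2
    # capital requirement at the 99.9th percentile, net of expected loss
    k = (lgd * (N.cdf((N.inv_cdf(pd) + sqrt(r) * N.inv_cdf(0.999))
                      / sqrt(1 - r)) - pd)
         * (1 + (maturity - 2.5) * b) / (1 - 1.5 * b))
    return 12.5 * k  # risk-weighted assets per unit of exposure

rw = irb_risk_weight(0.01)  # a borrower the bank rates at 1% PD
```

At a 1% PD this comes out near a 92% risk weight, below the flat 100% an unrated corporate attracts under the standardized approach; a bank that can demonstrate lower PDs captures the difference as a capital saving.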

Danske Bank, Denmark's largest bank, has begun using SAS Risk Management for Banking as its platform for calculating economic capital. The bank wanted a new platform that connects risk and capital calculations with analysis. Analysts at the bank can work without IT involvement, and risk calculations can be performed by one unit using an integrated approach.

"We are working on developing a better interaction between the bank's risk experts and the bank managers, as well as the interaction between IT and business in this area," says Simon Haldrup, head of the risk and capital management unit. "One important element is that we are able to manage all parts of our calculations and develop, change and understand the analysis behind the figures ourselves. We want to drill down and understand the dynamics, and at the same time become better at compressing and communicating this knowledge to the management of the bank."

The long-term goal at Danske Bank is to build a risk-conscious culture. "We have to be razor-sharp on recognizing and understanding the risk we are exposed to," says Haldrup. "It is about telling the good customer from the bad customer, and knowing the risk-adjusted profitability of the customer. We can measure and analyze it, and we have to move that way."
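Risk-adjusted profitability of a customer is commonly measured as something like RAROC: net revenue less expected loss, scaled by the economic capital the relationship consumes. The sketch below is a generic illustration with invented numbers, not Danske Bank's actual methodology:

```python
def raroc(revenue, costs, ead, pd, lgd, economic_capital):
    """Risk-adjusted return on capital for a single customer.

    Expected loss (PD * LGD * exposure at default) is deducted from net
    revenue, and the result is scaled by the economic capital the
    exposure consumes. A generic sketch with hypothetical inputs.
    """
    expected_loss = pd * lgd * ead
    return (revenue - costs - expected_loss) / economic_capital

# Hypothetical customer: 50k revenue, 20k costs, 1M exposure at
# 1% PD / 40% LGD, consuming 80k of economic capital
r = raroc(50_000, 20_000, 1_000_000, 0.01, 0.40, 80_000)
```

Two customers with identical revenue can land on opposite sides of a hurdle rate once expected loss and capital consumption are counted, which is the "good customer from the bad customer" distinction Haldrup describes.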