adopting the trading capital rule, the latest in a series of regulations designed to measure market risk more effectively and guard against it. Now regulators must address the methods banks use to measure risk. In our experience, one popular method - Monte Carlo simulation - has some serious drawbacks. But a potentially better approach, term structure modeling, is already available.

The regulatory initiatives reflect the trend toward measuring and managing risk on a mark-to-market basis. Thoughtful bankers have come to believe that this is the only comprehensive way to measure accurately all the key risks embedded in a bank balance sheet. But the regulations are not watertight with respect to ensuring the accurate measurement and hedging of market risk. Because banks draw on many different financial theories and analytical methods to drive their internal models, the risk measurements those models generate are inconsistent and vary widely in accuracy.

Almost everyone acknowledges that measuring market risk is fairly straightforward so long as a bank's balance sheet is restricted to fixed-rate term deposits and fixed-rate loans that are not prepayable. But internal models cannot adjust for the optionality that is pervasive in the typical bank's balance sheet today. Almost every retail banking product allows the retail customer to exercise some option: borrowers may prepay, and depositors may withdraw funds early. Commercial loans are prepayable at any time, despite restrictions to the contrary, if the commercial borrower is a valued bank customer. Banks are active issuers of funding instruments structured with "American" options to call or put. And banks actively purchase, sell, and design all forms of derivative securities to manage risk.

The potential consequence of relying on faulty analytics to measure option risk is a risk manager's ultimate nightmare: running a perfect hedge on the "wrong number." Such mistakes are not unprecedented, and they have cost some of the world's largest banking institutions losses totaling millions of dollars.

Since the early 1980s, Monte Carlo simulation has been widely used to value options and other derivative securities when exact formulas are not available. Monte Carlo simulation uses random numbers to sample many of the paths an option can follow, discounts the value realized along each path, and averages the results to estimate the value of the option. It is often the only way to value complex securities whose payouts at any point in time depend not only on current interest rates but on the exact path interest rates took to reach that level. Analyzing adjustable-rate mortgages presents this kind of "path dependent" problem, and it is a good example of when Monte Carlo simulation is appropriate, absent a better alternative.

Nevertheless, Monte Carlo simulation has serious flaws, and anyone who relies on Monte Carlo-based software and analytics to manage a bank's risk position should do so with extreme caution. One well-documented flaw is that Monte Carlo simulation can never produce an accurate value for a financial instrument with an embedded American option, such as a prepayable mortgage, a deposit subject to early withdrawal, or most types of swaptions. There is no rule in the calculation that describes what the holder of the option should do; this rule has to be imposed externally, usually in the form of a prepayment table. But this is a classic Catch-22: to know the prepayments one will experience, one has to know how consumers value the option to prepay; to know how consumers value the option to prepay, one needs the prepayment table.
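To make the mechanics concrete, here is a minimal sketch of path-by-path Monte Carlo valuation. Every parameter is an illustrative assumption, not something from the article: rates follow a simple random walk, the claim pays off like a caplet at maturity, and steps are monthly. The point is the estimator itself - sample many paths, discount along each, average - and its sampling error, which shrinks only as one over the square root of the number of paths.

```python
import math
import random
import statistics

def mc_option_value(n_paths, n_steps=12, r0=0.05, vol=0.01,
                    strike=0.05, notional=100.0, seed=42):
    """Monte Carlo sketch: sample short-rate paths, pay a simple
    caplet-style claim at maturity, discount along each path, average.
    All model choices here are hypothetical and for illustration only."""
    rng = random.Random(seed)
    dt = 1.0 / 12.0
    pv = []
    for _ in range(n_paths):
        r = r0
        discount = 1.0
        for _ in range(n_steps):
            discount *= math.exp(-r * dt)                  # discount along the path
            r += vol * math.sqrt(dt) * rng.gauss(0.0, 1.0)  # random-walk rate move
        payoff = notional * max(r - strike, 0.0)           # caplet-style payoff
        pv.append(discount * payoff)
    value = statistics.fmean(pv)
    stderr = statistics.stdev(pv) / math.sqrt(n_paths)     # sampling error ~ 1/sqrt(n)
    return value, stderr

value, stderr = mc_option_value(10_000)
```

Running this with 100 paths and again with 10,000 shows the practical cost the article describes: a hundredfold increase in work buys only a tenfold reduction in sampling error.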
Another problem with Monte Carlo simulation is inaccuracy in hedging, due to the sampling error inherent in the technique. The following example shows the impact of sampling error on the practical use of Monte Carlo.

Suppose we perform 100 simulations and value our portfolio at $100, with a sampling error of $2. That means $100 is only an estimate of the true answer, with a standard deviation of $2 around the estimate: the true answer will be within plus or minus $2 - that is, between $98 and $102 - with roughly 68% probability. If we then stress-tested the portfolio by shifting the yield curve up to calculate interest rate sensitivity, another 100 simulations might produce a stress-tested value of $99 and, again, a sampling error of $2. At first glance, we would have a "delta" of $1 and could calculate a hedge with a delta of minus $1 to neutralize our interest rate risk.

But there is more to it than that. Because the two runs are independent, their errors combine as the square root of the sum of their squares, giving a sampling error of about $2.83 around the "delta" of $1. This means that if a hedge is based on the "delta" of $1, there is a 36.2% chance not only that the hedge amount is wrong, but that the hedge itself is in the wrong direction! To mitigate this hedging problem, users of Monte Carlo simulation need to run at least 2,000 to 10,000 iterations to reduce the inaccuracies from sampling error to an acceptable level. The time and computational requirements of such comprehensive simulation analysis make Monte Carlo simulation quite impractical for many applications.

Luckily, a faster, more accurate approach to option valuation is readily available. This approach, "term structure modeling," is built from three simple steps:

*Fitting observable market data with a continuous, smooth line to form a security valuation yield curve.

*Selecting a "term structure" model and the appropriate modeling parameters to specify how interest rates move randomly in the future.
*Employing a numerical calculation method to value the instrument with the embedded option.

This approach produces significantly more accurate values than Monte Carlo simulation, and each option valuation can be completed in a fraction of a second on a personal computer. In addition, with a term structure modeling approach, hedge calculations are accomplished simultaneously with the valuation, so no additional run time is required. This reduces calculation time from hours to minutes.

We wholeheartedly applaud the regulators for the tremendous progress they have made during the last year in requiring banks to adopt mark-to-market valuation practices in managing risk. We recommend that the next set of regulatory initiatives carefully examine and establish guidelines for the analytical methodologies banks employ in their internal models. This will make the analyses banks generate more meaningful and greatly reduce the probability of error, by ensuring that less accurate techniques such as Monte Carlo simulation are used only when appropriate.

Mr. van Deventer is president and Mr. Levin is director of U.S. operations for Kamakura Corp. The New York investment banking company also offers risk management consulting and software development.
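As an appendix-style illustration of the third step - a numerical method for valuing an embedded American option - here is a minimal sketch of backward induction on a lattice. The symmetric binomial short-rate tree, the 50/50 move probabilities, the annual steps, and the particular callable bond are all simplifying assumptions for illustration; a production model would first fit the yield curve and calibrate a chosen term structure model, as the first two steps describe. Note that the optimal exercise of the American call happens inside the calculation itself - no external prepayment table is needed.

```python
def callable_bond_price(r0=0.05, dr=0.01, coupon=6.0, face=100.0,
                        call_price=100.0, years=5):
    """Value an annual-coupon bond, callable at par on each coupon date,
    on a hypothetical symmetric binomial short-rate lattice.

    At node (t, j) the one-year rate is r0 + (2j - t) * dr, and rates
    move up or down by dr with equal probability each year."""
    # Cum-coupon values at maturity: face plus the final coupon.
    v = [face + coupon] * (years + 1)
    # Backward induction from the last coupon date down to year 1.
    for t in range(years - 1, 0, -1):
        v = [
            coupon + min(
                # Ex-coupon continuation value: discounted expectation.
                0.5 * (v[j] + v[j + 1]) / (1.0 + r0 + (2 * j - t) * dr),
                # Optimal issuer call caps the ex-coupon value at the call price.
                call_price,
            )
            for j in range(t + 1)
        ]
    # Today's price (no coupon and no call at time zero).
    return 0.5 * (v[0] + v[1]) / (1.0 + r0)
```

Setting call_price to infinity disables the call and prices the straight bond; the gap between the two prices is the value of the embedded American option. The whole calculation is a single deterministic pass with no sampling error, which is why valuation and hedging on a lattice run in a fraction of the time Monte Carlo requires.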