As regulators see it, banks' own computer models provide  the perfect gauge of how much capital to set aside for big swings in the   value of securities and derivatives.   
But some bankers and risk management experts aren't so sure that the  models, which use historical market data and statistical projections to   estimate how much the value of a bank's portfolio can rise or fall, can be   translated into fair capital standards.     
  
A group of Federal Reserve Board economists agrees, and has proposed a  radically different approach. 
For now, only the 25-odd big U.S. banks that will have to meet new  capital standards starting in 1998 are affected by this debate. But the   outcome will also say a lot about whether regulators can ever extend their   capital adequacy standards, which now factor in only credit risk, to cover   the array of risks faced by banks.       
  
Behind the controversy is the Basel Committee on Banking Supervision,  which agreed on risk-based capital standards for credit risk in 1988 and   has been struggling ever since to add what it calls "market risk" to the   equation.     
Originally the Basel Committee - made up of banking supervisors from 12 wealthy nations - proposed a standardized market risk capital formula. But U.S. banks and bank regulators objected, and in April 1995 the committee proposed letting banks choose between following the regulatory formula or using their own computer models to calculate how much capital they must set aside. Last month, the Basel panel formalized this two-pronged approach as an amendment to its 1988 capital accord.
The internal models approach has clear advantages.
  
Complex models are now used to give bankers simple, understandable  "value-at-risk" numbers that quantify how much money they could make or   lose over a set period of time. These computer simulations allow banks to   keep track of derivatives, securities, and foreign exchange trading risks   in ways that couldn't have been imagined just a few years ago - and in ways   that a standardized regulatory formula never could match.         
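For illustration, here is a minimal sketch of one common technique, parametric value-at-risk, which assumes daily profits and losses follow a normal distribution. The portfolio history and the 99% confidence level below are invented, not any bank's actual figures or methodology.

```python
import statistics

# Hypothetical daily profit-and-loss history for a trading portfolio,
# in millions of dollars; a real bank would use a much longer series.
daily_pnl = [1.2, -0.8, 0.5, -2.1, 0.9, 1.7, -1.4, 0.3, -0.6, 2.0]

# Parametric ("variance-covariance") value-at-risk assumes daily P&L
# is normally distributed and reads a loss off the left tail.
mean = statistics.mean(daily_pnl)
stdev = statistics.stdev(daily_pnl)

Z_99 = 2.33  # one-tailed 99% quantile of the standard normal distribution

one_day_var = Z_99 * stdev - mean
print(f"1-day 99% value-at-risk: ${one_day_var:.2f} million")
```

The single number that comes out answers one question: with 99% confidence, how much could the portfolio lose over one trading day, if recent history is a reliable guide?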
Allowing banks to use their existing computer models could therefore not only spare them a lot of paperwork but also hold out the potential for setting capital standards more precisely and fairly.
Also, the banks and software companies that measure value-at-risk agree  on general principles of how it should be done. This is not the case with   interest rate risk modeling, which is one reason why the banking agencies   have virtually given up on developing a rate risk capital standard.     
But the value-at-risk models do have their differences, and their blind  spots. 
  
"Value-at-risk is a very valuable tool," said Tanya Styblo Beder, a  principal in the New York consulting firm Capital Market Risk Advisors.   "However, if it is misused or misunderstood in terms of its limitations, it   can delude people into thinking all is well when all is not well."     
What are the computer models' limitations? They are short-term in nature - most banks' models assume that securities and derivatives will be held for only a day. Banks also must make a lot of assumptions and omissions to fit widely varying assets and liabilities into one computer model. Different ways of calculating value-at-risk can deliver very different results. And so far no one has found a way to program into the models how much risk a bank is likely to take, or how well its risk management systems will work.
"Even putting aside limitations in estimating one-day risk exposures,  the models are not designed to measure the longer-horizon exposure that is   the intended basis of a regulatory capital requirement," wrote Federal   Reserve Board senior economists Paul H. Kupiec and James M. O'Brien in a   paper published in December.       
As a result, some bankers think rules based on value-at-risk will  overstate how much capital they need. 
"To extrapolate from a daily value-at-risk to the capital required for  annual losses may not be valid," said Evan Picoult, a managing director of   risk analytics at Citibank. "It would tend to ignore internal risk   management procedures which would limit losses."     
That view is not universally shared. Tanya Azarchs, a director of  financial institutions ratings at Standard & Poor's, argued in a company   publication last summer that a standard based on value-at-risk will   underestimate capital needs.     
"The method is not designed to project the maximum loss a firm could  suffer during any financial reporting period, particularly bear markets,"   she wrote.   
The regulators who helped frame the market risk standards say they took  these limitations into account. 
"That's baked in the cake in this approach," said Christine Cumming, a  senior vice president at the Federal Reserve Bank of New York, at a Federal   Deposit Insurance Corp. derivatives conference Feb. 9. "It's tailored to a   bank's specific risk profile; it will evolve with the bank's specific risk   profile."       
The market risk standard agreed on by the Basel Committee forces banks  to work within certain parameters in setting up their computer models, then   multiplies the daily value-at-risk figure to come up with a capital charge.   
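As generally described, the published formula takes the larger of the previous day's value-at-risk and a supervisory multiplier - with a floor of three - applied to the average value-at-risk over the preceding 60 business days. A sketch with invented inputs:

```python
import statistics

# Invented value-at-risk history, $ millions, for the last 60 business days.
last_60_days_var = [30.0 + 0.1 * day for day in range(60)]
yesterdays_var = 34.0

multiplier = 3.0  # supervisory floor; poor back-testing can raise it

# Capital charge: the larger of yesterday's figure and the multiplier
# times the 60-day average.
capital_charge = max(yesterdays_var, multiplier * statistics.mean(last_60_days_var))
print(f"market risk capital charge: ${capital_charge:.1f} million")
```

The multiplier is the committee's cushion against exactly the model limitations described above.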
The standard also now includes a detailed "back-testing" formula that requires banks that haven't done a good job of predicting the volatility of their portfolios to post extra capital. The FDIC, Fed, and the Office of the Comptroller of the Currency, which issued most of the market risk plan as a proposed rule last July, plan to publish the back-testing section for comment within a few weeks.
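The back-testing mechanics can be sketched simply: count the "exceptions" - days on which actual losses blew through the model's prediction - over roughly a year of trading, and raise the multiplier for models that miss too often. The zone boundaries below follow the committee's published "traffic light" framework as commonly summarized; the example count is invented.

```python
def multiplier_add_on(exceptions: int) -> float:
    """Extra capital multiplier based on exceptions in 250 trading days."""
    if exceptions <= 4:    # green zone: the model is deemed acceptable
        return 0.0
    if exceptions <= 9:    # yellow zone: graduated add-on
        return {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}[exceptions]
    return 1.0             # red zone: the model is deemed unreliable

# A hypothetical bank whose losses exceeded its value-at-risk on 7 days:
print(f"capital multiplier: {3.0 + multiplier_add_on(7):.2f}")  # 3.65
```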
Ms. Beder, a former derivatives trader, said the back-testing  requirement is a step in the right direction. But she added that coming up   with a consistent, accurate quantitative measure for the risks associated   with derivatives trading and other market activities may not be possible.     
In a study published last fall, Ms. Beder tried eight different value-at-risk modeling techniques on the same portfolio and came up with one result 14 times higher than another.
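The divergence is easy to reproduce in miniature. On the same invented, fat-tailed profit-and-loss history, a parametric calculation and a historical-simulation calculation return noticeably different answers - here a modest gap, nothing like the 14-fold spread Ms. Beder found across eight techniques:

```python
import statistics

# One invented P&L history ($ millions), deliberately fat-tailed.
daily_pnl = [0.4, -0.2, 0.6, -0.5, 0.3, -7.0, 0.5, -0.4, 0.2, -0.3,
             0.1, -0.6, 0.4, -0.1, 0.7, -0.2, 0.3, -0.5, 0.2, -6.0]

# Technique 1: parametric, assuming normally distributed P&L.
parametric_var = 2.33 * statistics.stdev(daily_pnl) - statistics.mean(daily_pnl)

# Technique 2: historical simulation - with only 20 observations, the
# 99% loss quantile is simply the worst day on record.
historical_var = -min(daily_pnl)

print(f"parametric 99% VaR:  ${parametric_var:.1f} million")
print(f"historical 99% VaR:  ${historical_var:.1f} million")
```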
Ms. Azarchs of Standard & Poor's surveyed money-center banks' 1994  annual reports and found that the value-at-risk figures they reported bore   little relation to the actual volatility of their trading revenues.   
Regulators - who tried to smooth out these differences by dictating  boundaries within which banks must make their value-at-risk calculations -   have drawn fire from money-center banks. The banks contend that the   regulators are making the internal models approach just as rigid as the   standardized approach.       
"As soon as you try to get consistency, you take away the flexibility of  the internal models," said Hannah Sorscher, vice president of global   derivatives at Citibank.   
Said Ms. Beder: "There's a lot of art to the science of risk management.  I think from a regulatory standpoint, that's where you have to start." 
Susan Krause, senior deputy comptroller for bank supervision policy at  the OCC, said regulators tried to factor a little of the "art" of risk   management into the market risk rule by requiring banks that want to use   internal models to meet certain qualitative standards.     
But Ms. Krause, while defending the market risk capital plan, agreed  that capital standards have their limits. 
"I do feel that we sometimes put too much emphasis on capital as being  the primary risk management tool," she said. Instead, she said, regulators   need to pay more attention to banks' risk management systems - something   the OCC and Fed have already started to do in their examinations.     
For now at least, the agencies are also moving ahead with the new  capital requirements - in part because the capital charges envisioned   simply aren't all that big. Using value-at-risk numbers from 1994 annual   reports, Ms. Azarchs of Standard & Poor's calculated that J.P. Morgan's   market risk capital charge would add up to $211 million, less than 3% of   its 1994 equity capital. Citibank's market risk charge would be $718   million, 5% of 1994 equity capital.           
At the Fed, however, another approach has been percolating. Devised by  Mr. Kupiec and Mr. O'Brien, this "precommitment approach" would let banks   decide for themselves how much capital to set aside for their trading   activities, then face the consequences - fines, increased regulatory   scrutiny, or embarrassing public disclosure - if their trading losses   exceeded the capital cushion.         
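Mechanically, the idea is simple, as the sketch below suggests; the cushion, the loss path, and the trigger are all hypothetical, since the economists left the choice and calibration of sanctions to regulators.

```python
# Precommitment sketch: the bank picks its own trading capital cushion,
# and a breach during the measurement period triggers sanctions.
precommitted_capital = 50.0  # $ millions, chosen by the bank itself

cumulative_loss = 0.0
hypothetical_daily_pnl = [2.0, -8.0, -31.0, 4.0, -20.0]  # $ millions

for pnl in hypothetical_daily_pnl:
    cumulative_loss = max(0.0, cumulative_loss - pnl)
    if cumulative_loss > precommitted_capital:
        # The proposal's possible consequences: fines, closer supervisory
        # scrutiny, or mandatory public disclosure of the breach.
        print(f"breach: ${cumulative_loss:.0f}M of losses exceed the "
              f"${precommitted_capital:.0f}M precommitment")
        break
```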
Patrick Parkinson, associate director of the Fed's division of research  and statistics, started Mr. Kupiec and Mr. O'Brien on this path by   assigning them to figure out how to validate banks' value-at-risk models   for use in setting capital standards.     
"They had been asked to push the internal models approach along, but  instead went off in another direction," Mr. Parkinson said at the Feb. 9   FDIC derivatives conference.   
The Federal Reserve Board enthusiastically endorsed the precommitment  approach at its June 21 meeting. Big U.S. derivatives banks joined in the   praise in comment letters to the Fed.   
But the idea won little support on the Basel Committee, and Mr.  Parkinson said the Fed will hold off on it until the Basel market risk   rules have been finalized by U.S. regulators.