BankThink

Can Banks' Volcker Metrics Be Trusted?

This is the fourth article in an eight-part series.

Be careful what you wish for.

Banks are learning the wisdom of this idiom as they face the enormous technological demands of complying with the Volcker Rule. Their extensive lobbying against the rule helped them win numerous market-making, hedging and underwriting exceptions. Now risk managers and IT personnel are left struggling to deal with the rule's significant granular data requirements.

If the Volcker Rule could really be implemented in a timely manner, it would impose upon banks a badly needed due diligence framework for monitoring their securities and derivatives trading, along with their hedge fund and private equity investments. Yet what I observe is that the challenges of acquiring data and making the necessary technological changes constitute significant operational risk exposures for banks trying to implement the Volcker Rule, along with the difficulties posed by people, processes and external events such as reliance on outsourced lawyers and IT vendors. Numerous research reports showing that banks are having serious data and technology problems only reinforce my on-the-ground perceptions about both domestic and foreign bank organizations here in the U.S.

To qualify for exemptions, banks must ensure that their trading desks' inventory does not exceed the reasonably expected near-term demands of customers. Simply figuring out what constitutes reasonable near-term demand takes time: banks must provide demonstrable analysis of historical customer demand. But because banks were not collecting data to prove the purpose of their transactions before the Volcker Rule was passed, they are now struggling to reconstruct that history and find reliable information.
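
To make the data challenge concrete, consider a minimal sketch of the kind of analysis the rule contemplates: estimating reasonably expected near-term customer demand from historical trading volume and flagging a desk whose inventory exceeds it. This is an illustrative simplification only; the trailing-average estimator, the window and horizon lengths, and the sample figures are all assumptions, not any bank's or regulator's actual methodology.

```python
# Illustrative sketch only: a simplified check of trading-desk inventory against
# an estimate of reasonably expected near-term customer demand (RENTD).
# The trailing-average estimator, window, horizon and sample data are assumptions,
# not the methodology prescribed by the Volcker Rule or used by any bank.
from statistics import mean

# Hypothetical daily customer buy volume for one desk, most recent day last.
historical_customer_demand = [120, 95, 140, 110, 130, 100, 150, 125, 115, 135]

def estimate_near_term_demand(daily_volumes, window=5):
    """Estimate daily near-term demand as a trailing average of recent customer volume."""
    return mean(daily_volumes[-window:])

def inventory_within_rentd(current_inventory, daily_volumes, horizon_days=3):
    """Flag whether inventory exceeds estimated customer demand over a short horizon."""
    allowed = estimate_near_term_demand(daily_volumes) * horizon_days
    return current_inventory <= allowed, allowed

ok, limit = inventory_within_rentd(current_inventory=500,
                                   daily_volumes=historical_customer_demand)
print(f"Estimated RENTD limit: {limit:.0f}; inventory within limit: {ok}")
```

Even a toy calculation like this depends on a clean history of customer-facing volume by desk, which is exactly the data many banks never collected.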

Other parts of the rule are also favorable to banks but are nonetheless data-intensive. For example, portfolio hedging, which was a major cause of JPMorgan Chase's London Whale scandal, is still permitted. However, the Volcker Rule requires banks to produce contemporaneous documentation at trade execution demonstrating that the hedges actually mitigate the specific risks in question.

Another significant problem that predates the Volcker Rule is that manual processes still abound in trading — especially for tailored over-the-counter derivatives and illiquid securities. For example, traders still input some transactions into spreadsheets and email the spreadsheets between different departments and legal entities, and even between countries. With these types of processes still in place, it's no wonder that risk managers and IT professionals are significantly challenged by the prospect of figuring out what new systems are necessary and how to modify existing ones.

Legacy systems that cannot easily be adapted to the granular requirements of the Volcker Rule and of other Dodd-Frank and Basel III rules are another significant obstacle. Banks often have multiple, disparate systems for securities and for different types of derivatives. This complicates data accumulation and aggregation, which are now even more important under the Volcker Rule. Often the systems do not facilitate communication between front, middle and back offices, making it difficult to automate key risk data in the way necessary for day-to-day trading, not to mention the Volcker Rule's quantitative metrics.

Bankers are well aware of these problems. A bank professional at one of the largest foreign bank organizations in the U.S. recently told me that "systems have not been updated to have the necessary features to accumulate the data in a way necessary to report Volcker metrics and to be granular enough to report by trading desk."

Part of the reason banks are lagging behind their compliance timelines is that chief risk officers and chief data officers have known the final rule for only 10 months. Moreover, parts of the rule that are vague or subject to multiple interpretations are making it difficult for risk and data officers to decide what types of systems they want, how they need to change data accumulation, validation and reporting, and how to assign the right personnel to the right posts. In addition, numerous government lawsuits over alleged rate manipulation and improper foreclosures are diverting management attention from the IT challenges surrounding the Volcker Rule and capital rules.

Ideally, banks need a single platform that can be automated for data accumulation and aggregation. And since data governance has taken on additional importance with the Volcker Rule, the platform needs to be robust enough to provide an audit trail for risk managers, auditors and regulators.
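
To illustrate what the audit-trail piece of such a platform might look like in its simplest form, here is a hypothetical sketch that appends each reported metric along with its source system and a hash of the prior record, so later tampering is detectable. The field names and hash-chaining scheme are assumptions chosen for illustration, not a description of any actual bank or vendor platform.

```python
# Hypothetical sketch of an append-only audit trail for reported metrics.
# Field names and the hash-chaining scheme are illustrative assumptions,
# not a description of any actual bank or vendor platform.
import hashlib
import json
from datetime import datetime, timezone

audit_trail = []  # in practice this would live in durable, access-controlled storage

def record_metric(desk, metric_name, value, source_system):
    """Append a metric record linked by hash to the previous entry."""
    previous_hash = audit_trail[-1]["record_hash"] if audit_trail else "genesis"
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "desk": desk,
        "metric": metric_name,
        "value": value,
        "source_system": source_system,
        "previous_hash": previous_hash,
    }
    entry["record_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    audit_trail.append(entry)
    return entry

record_metric("rates-desk-1", "inventory_turnover", 4.2, "front_office_system_a")
record_metric("rates-desk-1", "customer_facing_trade_ratio", 0.87, "middle_office_system_b")
print(f"{len(audit_trail)} audit records; latest hash {audit_trail[-1]['record_hash'][:12]}...")
```

The point of the sketch is simply that every reported figure carries its provenance, which is what risk managers, auditors and regulators would need in order to trace a Volcker metric back to its source.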

Many of the largest banks with over $50 billion in assets are building their own in-house systems. However, it is questionable how well those systems are integrated with existing systems at the firms. Even more questionable is how smaller firms will cope with the cost and complexity of new technology demands.

It is high time that regulators, financial journalists and analysts ask themselves whether banks that are struggling to monitor their counterparties, failing stress tests, making accounting mistakes and running into regulatory reporting problems can really be trusted to report Volcker quantitative metrics accurately and in a timely manner.

Next in the series: The Volcker Rule and coping with external events.

Mayra Rodríguez Valladares is managing principal at MRV Associates, a New York-based capital markets and financial regulatory consulting and training firm. Follow her on Twitter at @MRVAssociates.
