BankThink

Accepting What You Don't Know Is Crucial to Detecting Risk

One remarkable feature of the 2008 financial crisis is that the vast majority of risk experts were caught unawares by the extreme fragility of our financial system. Most analysts remained bullish about the housing market in the months leading up to the crisis, and even among those who saw the potential for a slew of foreclosures, very few recognized that mortgage defaults could destabilize the global financial system.

After the financial crisis, Congress passed the Dodd-Frank Act, requiring stricter oversight of large financial institutions, including stress testing that uses complex statistical models to assess whether large banks hold sufficient capital reserves to survive a hypothetical severe recession. Tighter supervision and improved risk analysis are important, but it is worth remembering that overconfidence in sophisticated statistical models also contributed to the financial meltdown.

To avoid repeating similar mistakes, financial risk professionals should incorporate lessons from the physical and behavioral sciences on the study of ignorance. The fundamental message of that body of research for the financial industry is that a strong, dynamic risk detection system requires focusing on the most important blind spots not satisfactorily explained by existing tools, rather than on the "answers" those tools produce.

Focusing on anomalies and unknowns is particularly important for financial risk analytics given how risks emerge in the marketplace. Financial market risks evolve and grow precisely because they avoid detection by the prevailing measurement systems. This dynamic sometimes plays out consciously, as individuals find ways to circumvent methods put in place to prevent bad behavior. The sudden collapse of Barings Bank in 1995 is a spectacular example: a rogue trader successfully circumvented the bank's limits on trading exposures. Today's bad actors may be able to do far greater damage in the form of cybercrime or cyberterrorism; cybercriminals constantly probe for vulnerabilities in the defense mechanisms that protect vital digital networks.

There are also dangerous risks beyond purposeful attempts to circumvent defensive measures. New, highly profitable financial innovations are constantly emerging, and it is inevitable that in some cases the risk of a new activity will be underestimated, fueling rapid growth in that product or service. This can quickly turn into a dangerous spiral in which those who promoted and profited from the new activity rationalize and defend their prior decisions and downplay signs of excessive risk.

The extraordinary growth of complex mortgage securities exemplifies these processes. Look no further than the collateralized debt obligations that were at the heart of the financial crisis. CDOs had been around for some time and were not considered risky. However, the nature of CDOs changed dramatically between 2002 and 2007. A typical CDO in 2002 was a diversified security composed of bonds from a wide variety of loan types, including auto loans, credit card loans and residential mortgages, with mortgages representing approximately 15% of the pool. By 2006, poorly rated mortgage bonds made up over 90% of a typical CDO.

CDOs were perfect financial "weapons of mass destruction" (to borrow Warren Buffett's description of derivatives in general), in large part because of their complexity. These securities greatly amplified losses from mortgage defaults because they allowed investors to take multiple side bets on the same underlying mortgages.
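
To make the amplification mechanism concrete, here is a rough back-of-the-envelope sketch in Python. All of the figures are hypothetical, chosen only for illustration; the point is that when several synthetic bets reference the same pool of mortgages, a single wave of defaults produces losses several times the size of the underlying loans.

    # Hypothetical figures, for illustration only: how side bets that
    # reference the same mortgage pool multiply system-wide losses.
    pool_notional = 100_000_000   # face value of the mortgage pool ($)
    default_rate = 0.20           # share of mortgages that default
    loss_given_default = 0.60     # share of a defaulted loan that is lost

    # Loss borne by holders of the actual mortgages
    cash_loss = pool_notional * default_rate * loss_given_default

    # Each synthetic side bet referencing the pool transfers a comparable
    # loss to whoever took the losing side, with no new mortgages involved.
    num_side_bets = 5
    system_loss = cash_loss * (1 + num_side_bets)

    print(f"Loss on the mortgages themselves: ${cash_loss:,.0f}")
    print(f"System-wide loss with {num_side_bets} side bets: ${system_loss:,.0f}")

On these assumed numbers, $12 million of actual mortgage losses becomes $72 million of losses across the system.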

Neither CEOs nor regulators fully understood the degree to which CDOs would amplify mortgage default losses, or which investors were on the hook for those potential losses. These unknowns deserved concentrated focus and attention. Instead, analysts paid a great deal of attention to sophisticated methods for measuring the visible tip of the iceberg while failing to focus on what lay beneath the surface: risk that was not being detected by the analytics at hand.

There is an innate tendency for people to overestimate the extent of their knowledge. In the business world, the psychological factors generating this illusion of knowledge are exacerbated by incentives for compensation and promotion. People expressing greater self-confidence are perceived by peers to have greater skills—regardless of actual ability. In many cases, risk analysts are more likely to improve their professional status by confidently providing answers rather than focusing attention on areas of ignorance.

Better risk measurement systems and greater disclosure of information make it more difficult for risks to fly under the radar, but there are always new gaps in information, requiring a dynamic approach to identifying areas of ignorance. Here are a few practical steps that organizations can take to better address areas of ignorance and in turn improve risk analytics:

  • Require that risk analysis identify the most significant and relevant unknowns and anomalies. This should not be restricted to insufficient or poor-quality data; it should also cover gaps in information about the workings of markets, such as who reaps the benefits from a transaction and who is exposed to the risks.
  • Reward those who are diligent and transparent in pointing to areas of ignorance. Discourage those who exaggerate what they know or who resist open and professional challenges to their ideas.
  • Distinguish areas of ignorance that are actionable from those that are not. In this context, actionable refers to the potential for learning. For example, identifying that an important piece of information does not exist is worthwhile but not actionable; it becomes actionable only if the information can be created or if some useful proxy exists.
  • Develop a learning plan that focuses on areas of ignorance. Knowing what you don't know is useful, but awareness of ignorance is also a powerful engine for learning. This can be the most difficult step because it often requires breaking down silos within organizations. Learning often requires gaining new and different sets of skills and communicating with individuals who have different types of expertise.

Improving risk analytics and risk management is much more than a technical task. It also requires overcoming overconfidence (and sometimes arrogance) and groupthink. It often requires cultural change that promotes diversity of thought and honest dialogue and breaks down organizational or professional silos. Developing an effective system for detecting emerging risks means focusing on what you don't know as much as on what you do.

Dr. William W. Lang is a managing director with Promontory Financial Group and formerly head of supervision at the Federal Reserve Bank of Philadelphia, where he led the Fed's model validation program for stress testing.
