Editor's note: A version of this post appears in the January issue of American Banker Magazine.
Between heightened regulatory scrutiny and privacy and security concerns, financial institutions are under more pressure than ever to properly manage their data troves. But many in the financial services sector have yet to master data governance.
The gap exacerbates the downside of information, making it look more like a liability than an asset. Under state and federal law, unauthorized access to customers' personally identifiable information can result in significant fines and remediation costs, as well as reputational harm to the brand. Meanwhile, bad information (or errors of interpretation) can undermine compliance and risk management.
And yet information is valuable, a source of competitive advantage that justifies substantial investment in the people and infrastructure needed to create, validate and interpret it. Enterprise data is not a commodity. There's no third party that can deliver superior replacement data if yours is worn out. Though it comes with big responsibilities, enterprise information is a strategic asset. But is it being managed as such? Is data clearly marked as to its sensitivity with usage policies to prevent privacy breaches before they happen? Is it sufficiently labeled to minimize the chances that a misreading gets fed into risk models?
It is no accident that data governance demands have strengthened at a time of growing popularity for online banking, mobile banking apps and digital-only banks. Today's banks are mainly information systems. And they're complex ones at that. Large banks typically have dozens of legacy systems, often product-oriented and poorly integrated. They are dependent on these systems even as they invest heavily in new technology, and so they continually add layers to an already complex infrastructure. When customer information is spread across systems, with little ability to create a unified view of any given relationship, seemingly simple questions such as "How many customers do we have?" and "How much business do they do with us?" are very hard to answer. Though the individual systems may perform well operationally and hold valuable data, their siloed nature makes them less useful as information assets, with implications for business agility, risk management and compliance.
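To see why even "How many customers do we have?" is hard, consider a simplified sketch. The system names, fields and records below are hypothetical; the point is that siloed systems store the same customer under different identifiers and formats, so an accurate count requires matching records across systems rather than simply adding up rows.

```python
# Hypothetical illustration: two siloed product systems hold overlapping
# customer records under different identifiers and formats.
checking_system = [
    {"cust_id": "C-1001", "name": "Jane Q. Doe", "ssn": "123-45-6789"},
    {"cust_id": "C-1002", "name": "John Smith",  "ssn": "987-65-4321"},
]
mortgage_system = [
    {"acct": "M-77", "name": "DOE, JANE", "ssn": "123456789"},
    {"acct": "M-78", "name": "Ann Lee",   "ssn": "555-11-2222"},
]

def normalize_ssn(ssn):
    """Strip punctuation so records can be matched across systems."""
    return ssn.replace("-", "")

# Naively summing rows overcounts; matching on a shared key does not.
all_records = checking_system + mortgage_system
distinct_customers = {normalize_ssn(r["ssn"]) for r in all_records}
print(len(all_records))        # 4 raw records across the two systems
print(len(distinct_customers)) # 3 actual customers (Jane Doe appears twice)
```

In practice no single clean key like this exists across dozens of legacy systems, which is precisely why the unified view is so elusive.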
All financial institutions face challenges in the discovery, integration, access and interpretation of information, all of which are aggravated by the complexities of banking systems and regulatory compliance. What's the solution? The stock answer often involves a cry for investment in the data warehouse. But the landscape is always changing, so a data warehouse is never a complete mirror; there is always critical operational data that has yet to be incorporated.
With or without a warehouse, what's critical (and in many cases missing) is metadata: data about data. Metadata is the essential context that gives meaning to the numbers. Absent metadata, it is unclear, for example, whether the "salary" field in the human resources system is in dollars or euros; an error in interpretation here could really throw a wrench into the annual planning process.
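A minimal sketch of what machine-readable metadata can look like, assuming a hypothetical column-level catalog entry (the field names and values here are illustrative, not any particular product's schema):

```python
# Hypothetical metadata catalog entry for one field in an HR system.
salary_metadata = {
    "system": "human_resources",
    "field": "salary",
    "type": "decimal",
    "unit": "USD",                  # resolves the dollars-vs-euros ambiguity
    "period": "annual",
    "sensitivity": "confidential",  # can drive access and privacy policy
    "owner": "hr-data-steward",
}

def convert_salary(amount, metadata, target_unit, fx_rate):
    """Convert only when the recorded unit differs from the target unit."""
    if metadata["unit"] == target_unit:
        return amount
    return amount * fx_rate

# Because the unit is recorded, a planning tool can convert explicitly
# instead of silently assuming a currency.
print(convert_salary(90000, salary_metadata, "EUR", 0.92))  # 82800.0
```

The same catalog entry also carries the sensitivity label, which is what makes it possible to enforce usage policies before a privacy breach happens rather than after.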
The need for better metadata is underscored by Basel III capital standards, which require new, complex models for operational risk, value-at-risk for capital requirements and counterparty valuation adjustments for derivatives. The wisdom of this approach has been questioned elsewhere, but what's not widely discussed is the critical importance of data access and quality. Sophisticated models fed bad data can deliver impressive but misleading results.
As a data consumer, whether a risk manager, analyst, auditor, or an executive attesting to adherence to internal and regulatory policies, you must insist on high-quality, accessible metadata.
Henry Olson is the director of product management at Embarcadero Technologies, which makes software tools for developers, database administrators and data architects.