The Sarbanes-Oxley Act has survived its recent Supreme Court challenge and financial institutions have returned to the task of researching, reconciling and reporting on financial data. As the end of the quarter approaches, analysts will work around the clock to ensure the accuracy and completeness of corporate reports.
While the rules have remained the same, the mind-set toward financial data quality has changed in recent months: financial executives are now adopting many of the best practices developed by customer data stewards to streamline and automate the process.
Terms like validation, business rules, data normalization, jurisdiction assignment and governance are coming into vogue as new technologies emerge that can help institutions overcome many of the challenges associated with financial data quality.
Clearly, financial data is different from other data assets such as customer addresses. Financial data reports on the health of a company and must comply with numerous regulatory requirements. In addition to P&L statements, this information drives tax payments, agreements, royalties — and ultimately, investor confidence. Unlike with character-based records, tools like "spell check" and data parsing offer little assistance; currency conversion can complicate matters; and speed is critical given the time-sensitive nature of financial reporting.
Managers must also deal with the same challenges associated with customer data. The accessibility of accurate data is hampered by the fact that records are stored on disparate systems. It is not uncommon for different divisions or subsidiaries to rely on different platforms (or even desktop Excel files). Inconsistencies in definitions and labels are likely when enterprise resource planning, order entry and customer relationship management systems are compared, and the same company may refer to the same product using product codes, part numbers, brand names or abbreviations. Bundled offers may include both goods and services. And at the end of the day, financial executives must still find ways to allocate costs and revenue with total integrity.
For bankers, these issues affect not only their own firm, but commercial clients as well. A Gartner survey of senior finance managers reports that "the most critical technology issue was the need to improve data quality to enhance the accuracy and consistency of financial reporting."
Ultimately, the ability to streamline and automate financial data quality begins with an understanding that financial data is just another data domain. Data quality methods that have proven cost-efficient and effective in managing customer data are often applicable to financial data — once you can deal with the unique attributes of numeric data. Fortunately, these capabilities exist today, and financial institutions can make significant gains by adhering to these five best practices in financial data quality.
Control totals validation: Bank statements provide control totals in the statement or account header that make it easy for consumers to "error check" their statement. These same types of controls should be incorporated into an automated data quality check on enterprise-level financial reporting. Data quality platforms that were once limited to character-based functions can now handle the complex numeric calculations needed to check and validate these computations.
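A control-total check of this kind can be sketched in a few lines. The function names, tolerance parameter and figures below are illustrative assumptions, not the interface of any particular data quality product:

```python
# Hypothetical control-total check: compare the total declared in a
# statement or account header against the sum of its detail lines.
from decimal import Decimal

def validate_control_total(header_total: Decimal,
                           detail_amounts: list[Decimal],
                           tolerance: Decimal = Decimal("0.00")) -> bool:
    """Return True when the detail lines reconcile to the header total."""
    computed = sum(detail_amounts, Decimal("0"))
    return abs(computed - header_total) <= tolerance

# Example: a header claims 1,250.75 across three postings.
lines = [Decimal("500.25"), Decimal("600.00"), Decimal("150.50")]
validate_control_total(Decimal("1250.75"), lines)  # True
```

Using Decimal rather than floating point keeps cent-level sums exact, which matters when a reconciliation must balance to the penny.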
Business rules: It may be easy to identify errors in customer data, such as when letters are transposed in a name or address — but numeric errors (e.g., 123 instead of 213) are less obvious. That's why today's leading financial data quality tools incorporate business rules into their logic that compare actual totals to expected ranges and flag items that may be incorrect. When a service that normally sells for $5,000 reflects $50,000 in revenue, that may be a sign of data entry error, miscoding or a bundled offer.
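A simple range-based business rule might look like the following sketch. The fivefold threshold is an invented assumption; real rules would be tuned per product and routed to a reviewer rather than treated as definitive:

```python
# Hypothetical business rule: flag amounts that deviate sharply from the
# expected value for a product, as in the $5,000 service booked at $50,000.

def flag_out_of_range(amount: float, expected: float,
                      factor: float = 5.0) -> bool:
    """Flag when amount differs from expected by more than `factor` times."""
    if expected <= 0:
        return True  # nothing to compare against; route for manual review
    ratio = amount / expected
    return ratio > factor or ratio < 1.0 / factor

flag_out_of_range(50_000, 5_000)  # True: tenfold the expected price
flag_out_of_range(5_250, 5_000)   # False: within normal variation
```

Flagged items are candidates for review — a hit may indicate data entry error, miscoding or simply a bundled offer, as the article notes.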
Normalizing product definitions: Traditional data quality disciplines can provide immediate gains in dealing with inconsistent names and labels. A product may be listed by a product code in one system and a brand name in another. At times it may be sold as a stand-alone offer, other times as part of a bundle. Most high-end data quality tools can normalize these labels and provide for consistent product definitions that make it easy to aggregate financial data across platforms, divisions and geographies.
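The normalization step amounts to mapping the many labels for a product onto one canonical definition before aggregating. The alias table and product names below are invented for illustration:

```python
# Minimal sketch of label normalization: map product codes, brand names
# and abbreviations from different systems onto one canonical name so
# revenue can be aggregated across platforms. Alias data is invented.

ALIASES = {
    "PRD-1001": "Premier Checking",
    "PREM CHK": "Premier Checking",
    "PremierChk": "Premier Checking",
    "PRD-2002": "Wealth Advisory",
    "WLTH ADV": "Wealth Advisory",
}

def normalize_product(label: str) -> str:
    """Return the canonical product name, or the original label if unknown."""
    return ALIASES.get(label.strip(), label.strip())

def aggregate_revenue(records: list[tuple[str, float]]) -> dict[str, float]:
    """Sum amounts per canonical product across differently labeled records."""
    totals: dict[str, float] = {}
    for label, amount in records:
        product = normalize_product(label)
        totals[product] = totals.get(product, 0.0) + amount
    return totals

aggregate_revenue([("PRD-1001", 120.0), ("PREM CHK", 80.0)])
# {"Premier Checking": 200.0}
```

Unknown labels pass through unchanged here; in practice they would be queued for a data steward to classify.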
Tax calculations: Many industries are required to assess and remit sales and use, property and payroll taxes, and rates are often based on local tax jurisdictions that cannot be determined by simple ZIP code lookups. By assigning jurisdictions at the rooftop level, finance departments can automate tax compliance.
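The rooftop-level idea can be sketched as a lookup keyed on a geocoded address rather than a ZIP code, since one ZIP can span several jurisdictions. The addresses, jurisdiction codes and rates below are entirely invented:

```python
# Hedged sketch of rooftop-level jurisdiction assignment: two addresses
# in the same ZIP code can fall under different tax jurisdictions, so
# the lookup keys on the geocoded address. All data here is fictitious.

# Hypothetical table: geocoded address -> (jurisdiction code, combined rate)
ROOFTOP_JURISDICTIONS = {
    "100 Main St, Springfield": ("IL-SPRINGFIELD-CITY", 0.0975),
    "250 Rural Route 9, Springfield": ("IL-SANGAMON-COUNTY", 0.0725),
}

def assess_sales_tax(address: str, amount: float) -> tuple[str, float]:
    """Return the jurisdiction code and tax due for a rooftop-level address."""
    jurisdiction, rate = ROOFTOP_JURISDICTIONS[address]
    return jurisdiction, round(amount * rate, 2)

assess_sales_tax("100 Main St, Springfield", 1000.0)
# ("IL-SPRINGFIELD-CITY", 97.5)
```

A production system would source the jurisdiction table from a geocoding provider and keep rates current; the point of the sketch is only that assignment happens per address, not per ZIP.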
Governance: As data quality is a process, organizations should also put in place guidelines for how financial data quality will be managed. What issues or gaps exist today? Who can or should develop rules? Who can administer changes to rules?
In today's market, investors and regulators continue to shine the spotlight on compliance and data quality.
While finance departments can continue to rely on redundant processes and excessive man-hours to track down data and reconcile reports, there are ways to reduce operating costs and automate these functions. With today's technologies, finance executives can more easily adapt traditional data quality methods to financial data.