Technology Advances Make Systems Decisions That Much Harder

U.S. commercial banks will spend nearly $18 billion on information technology this year. For an industry that is required to produce reams of financial reports, it is perhaps incongruous that there is no public information on where all this money goes.

From surveys, we know that banks invest about a quarter of their technology dollars in new technologies and applications. That $4.4 billion is typically aimed at increasing revenue or reducing expenses, either by improving existing systems or by automating manual tasks.

To gauge the advantages of applying new systems, information technology decision makers theoretically use cost-benefit analyses. They take each business problem and weigh different solutions that could require changing operational procedures or investing in systems. But a cost-benefit analysis has some well-known limitations.

*The costs of the information technology solution are often indeterminable, especially when extensive development or new technologies are involved.

*Only one-time costs are considered, not life-cycle costs.

*The benefits, even if clearly defined, may not be quantifiable, especially when future revenue estimates depend on competitors' actions or when the benefits are somewhat qualitative.

*Those responsible for implementing the benefits are usually not those who make the cost-benefit calculations.

*Postmortem analysis to hone the organization's cost-benefit techniques is almost never done.

All these limitations are described in textbooks. The point is that bankers should be as accurate and complete as possible in estimating benefits. They must also work to improve techniques by adopting follow-up procedures.
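The life-cycle limitation, in particular, is easy to make concrete. The sketch below, with purely hypothetical figures, discounts multiyear benefits and recurring run costs instead of stopping at the one-time price tag:

```python
# Hedged sketch: hypothetical figures, not drawn from any bank's numbers.
# A life-cycle cost-benefit view discounts annual benefits and recurring
# run costs, rather than weighing only the one-time purchase price.

def npv(rate, cashflows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

def project_npv(upfront_cost, annual_benefit, annual_run_cost, rate, years):
    """NPV of a system investment over its assumed life, in $ millions."""
    flows = [-upfront_cost] + [annual_benefit - annual_run_cost] * years
    return npv(rate, flows)

# Hypothetical replacement system: $5M up front, $1.8M/yr benefit,
# $0.6M/yr maintenance, 10% discount rate, five-year life.
print(f"Life-cycle NPV: ${project_npv(5.0, 1.8, 0.6, 0.10, 5):.2f}M")
```

Under these assumed figures the net present value is negative: a project that looks attractive when only the one-time cost is weighed against the gross annual benefit fails once maintenance and the time value of money are counted.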

But there are other reasons why cost-benefit analysis is difficult to carry out and apply. We call these choices hard, harder, and hardest.

*Hard Choices: Individual decisions are linked together and cannot be made in a vacuum.

Banks are expected to spend at least $2 billion to rewrite computer code to properly account for years beyond 2000. The cost for any one bank is roughly proportional to the amount of code it runs, since each line of code must be searched for date fields and corrected wherever one is found. Thus a bank with multiple application systems will incur higher costs than a bank with a single system. That creates a dilemma: Should the bank regard compliance as an opportunity to replace those duplicate systems?

BankAmerica Corp., for example, uses a large proprietary deposit system in California; its Seafirst unit uses a Hogan deposit system; in other states, it outsources deposits to Marshall & Ilsley; and in other countries it is installing Fiserv's core accounting software. Chase Manhattan uses the Alltel deposit system for retail banking in New York and a different, proprietary deposit system for its commercial customers and its Texas Commerce affiliate. First Chicago NBD Corp. has three commercial loan systems.

Each of these banks may have sound reasons, such as organizational independence or unique features and functions, for supporting multiple application systems. But the year-2000 compliance decision will surely be harder when the extra costs of converting those multiple applications are considered. Chase, for example, is expected to combine 40 commercial lending systems into two, in part because of 2000 compliance.
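The mechanics of remediation explain why the cost scales with code volume: every line must be inspected for suspect date handling. Below is a toy scanner in that spirit, written in Python; the search pattern and the COBOL-like sample lines are invented for illustration, and the real remediation tools of the day were far more sophisticated.

```python
import re

# Toy illustration of why year-2000 remediation cost scaled with code volume:
# every line must be scanned for date handling. The pattern below (two-digit
# year picture clauses and YYMMDD-style field names) is invented for
# illustration only.
SUSPECT = re.compile(r"PIC\s+9\(2\)|YYMMDD", re.IGNORECASE)

def flag_date_fields(source_lines):
    """Return (line_number, line) pairs that may need year-2000 review."""
    return [(n, line) for n, line in enumerate(source_lines, start=1)
            if SUSPECT.search(line)]

sample = [
    "01  LOAN-MATURITY-YR  PIC 9(2).",
    "05  POSTING-DATE-YYMMDD  PIC 9(6).",
    "ADD INTEREST TO BALANCE.",
]
for n, line in flag_date_fields(sample):
    print(f"line {n}: {line}")
```

Every flagged line still needs a programmer to decide on and test a fix, which is why the bill multiplies across duplicate systems.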

*Harder Choices: Sometimes a new technology is a potential information technology solution looking for a problem.

New technologies, such as the smart card with its embedded computer chip, don't arrive in response to business problems for which they are the obvious solution. Rather, they emerge from underlying progress in scientific knowledge.

Visualization technologies, for example, can create high-quality graphics from digital data. This is useful in making animated films - but can it be used in financial services? One potential use is to create four-dimensional arrays - that's three spatial dimensions plus color - from real-time spreadsheets. On a trading floor, each cell could define two currencies, the bar height could equal the foreign exchange spread between them, and the bar color could represent the current profitability of the position.
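As a rough sketch of that mapping - with invented currency pairs, spreads, and profit figures, and modern Python plotting standing in for the proprietary systems dealers actually use - the display might be prototyped like this:

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy version of the trading-floor display described above: one 3-D bar per
# currency pair, height = relative FX spread, color = position profitability.
# Pairs, spreads, and P&L figures are invented for illustration.
pairs = ["USD/DEM", "USD/JPY", "USD/GBP", "USD/CHF"]
spreads = np.array([0.0004, 0.05, 0.0003, 0.0006])  # hypothetical spreads
pnl = np.array([1.2, -0.4, 0.8, -0.1])              # hypothetical P&L, $M

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
x = np.arange(len(pairs))
# Map P&L onto a red-to-green scale: losses red, profits green.
colors = plt.cm.RdYlGn((pnl - pnl.min()) / (pnl.max() - pnl.min()))
ax.bar3d(x, np.zeros_like(x), np.zeros_like(x),
         dx=0.6, dy=0.6, dz=spreads / spreads.max(), color=colors)
ax.set_xticks(x + 0.3)
ax.set_xticklabels(pairs)
ax.set_zlabel("relative spread")
plt.show()
```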

All major U.S. dealers are experimenting with this technology, and about 750 desks already use it.

This is interesting because the normal relationship between problem and solution is reversed. The application is technology-driven, and that is what makes the decision harder.

*Hardest Choices: Trying to match a new technology with an old, intractable problem.

Some 40% of banks' noninterest expense base goes to the retail delivery channel. For decades, technology investments there have aimed to reduce costs or increase revenue. The result has been a proliferation of retail channels, from branches and automated teller machines to home banking. Yet the benefits remain elusive, which casts doubt on the wisdom of those decisions. Each successive round of delivery channel investments has seemed to create more problems than it solved.

One reason is that each channel seems to merit its own systems, including separate channel-specific clients and servers, each with its own interfaces to the different mainframe or departmental core systems. Each channel has gotten out of sync with the others. So a new information technology need has arisen - for an enterprise delivery architecture based on a wide area network and a messaging service that sits between the channels and the core applications.
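What such a messaging layer does is simple to sketch. In the toy Python below, every channel emits a common message format and a router dispatches each message to the right core application; the message fields, operations, and system names are all invented for illustration:

```python
from dataclasses import dataclass

# Toy messaging layer: channel front ends (branch, ATM, home banking) all
# emit one common message format, and a router picks the core application.
# Fields, operations, and system names are hypothetical.

@dataclass
class ChannelMessage:
    channel: str        # "branch", "atm", "home_banking", ...
    operation: str      # "balance_inquiry", "transfer", ...
    account: str
    payload: dict

CORE_ROUTES = {
    "balance_inquiry": "deposit_system",
    "transfer": "payments_system",
    "loan_status": "loan_system",
}

def route(msg: ChannelMessage) -> str:
    """Pick the core application for a message, regardless of channel."""
    try:
        return CORE_ROUTES[msg.operation]
    except KeyError:
        raise ValueError(f"no core route for operation {msg.operation!r}")

msg = ChannelMessage("atm", "balance_inquiry", "12345678", {})
print(route(msg))  # -> deposit_system
```

Because the routing table, not the channel, decides where a message goes, adding a new channel no longer means building yet another set of interfaces to every core system.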

A second "hardest" example is a problem presented by banks' general ledger systems. To transfer data for roll-up from specialized applications on second-tier general ledgers to holding company general ledgers, they may be relying on "SneakerNet." They may combine data with incongruent time periods, and these data may frequently conflict. Profit as measured by the general ledger may not equal profit from the profitability system because of timing and definition differences, backdating, or calculation differences. In short, the roll-up sequence isn't streamlined.

One solution would be a new client/server general ledger system. The benefits would include larger code block sizes, auditable reversal capability, and collection of nonmonetary data such as head counts, CUSIPs, or transaction volumes. Yet the decision is complicated because new general ledger systems may well be optimally placed behind, not in front of, a bank's data warehouse. If the bank doesn't have a data warehouse, it may need to rethink its entire decision support infrastructure, thereby complicating the decision enormously.
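Two of those features - auditable reversal and nonmonetary data - can be sketched briefly. In the hypothetical structure below (invented field names, not any vendor's design), a posted entry is never deleted; a reversal appends an offsetting entry that references the original:

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of "auditable reversal": a posted entry is never deleted or edited;
# the ledger appends an offsetting entry referencing the original, preserving
# the audit trail. Nonmonetary data travel in a quantity field. All names
# here are invented for illustration.

@dataclass
class JournalEntry:
    entry_id: int
    account: str
    amount: float                      # monetary amount
    quantity: Optional[float] = None   # nonmonetary: head count, volumes, ...
    reverses: Optional[int] = None     # id of the entry this one reverses

class Ledger:
    def __init__(self):
        self.entries = []

    def post(self, account, amount, quantity=None, reverses=None):
        entry = JournalEntry(len(self.entries) + 1, account, amount,
                             quantity, reverses)
        self.entries.append(entry)
        return entry

    def reverse(self, entry_id):
        orig = self.entries[entry_id - 1]
        qty = -orig.quantity if orig.quantity is not None else None
        return self.post(orig.account, -orig.amount, qty,
                         reverses=orig.entry_id)

gl = Ledger()
e = gl.post("6100-BRANCH-SALARIES", 250_000.0, quantity=42)  # 42 = head count
gl.reverse(e.entry_id)  # offsetting entry; the original stays on the books
```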

Finally, it pays not to invest in a cost-benefit analysis of the wrong problem. Information technology investments are sometimes driven by the marketplace, not by the individual bank. For example, in 1986 and again in 1993, mortgage-backed security issuance reached a cyclical peak. New technological capabilities came on the market at those times - such as Fannie Mae's systems to give lenders direct agency access for loan sale and delivery or private vendors' systems to manage interest rate risk. To maintain technological parity at those times, mortgage players had to move with the herd - whether or not they were prepared.

Information technology decisions are growing in importance, but so is the difficulty of making them. The traditional cost-benefit analysis is useful for easy choices, provided care is taken to cope with its limitations. For the hard, harder, and hardest decisions, deep analysis and industry insight are required. Banks must stay educated about the impact of information technology and must follow the trends. The new banking paradigm must be imagined, but realistically. Bank executives who make better information technology decisions will inevitably pull ahead of those who don't.

Diogo Teixeira is president of the Tower Group, Wellesley, Mass. Bill Bradway contributed to this article.
