The need for banks to invest in information technology is often cited as a factor in the restructuring and consolidation now sweeping the industry.
Systems are expensive, of course, and larger banks are looking to gain economies of scale in their back offices - a goal that has indeed contributed to the recent tide of mergers, acquisitions, and reengineering projects.
But while the industry's overall information infrastructure has clearly changed in the last five years, it has not changed as fast as it would have had banks taken full advantage of technological advances.
This year's American Banker/Tower Group Technology in Banking Survey offers several examples of how the installed base of bank systems has slipped behind the state of the art:
- Industry spending on new technologies - defined as groupware, imaging, data warehousing, and client/server - was $1.2 billion in 1995 (7.1% of all technology spending), but was predicted to rise to $2.3 billion, or 11%, by 1998.
- Only about 67% of all bank employees have personal computers at their desks.
- About 35% of these client PCs are DOS-based and lack a graphical user interface like Microsoft Corp.'s Windows. Even some 2% to 3% of this year's new PC purchases are predicted to be DOS-only machines.
Indeed, over the past five years, the banking industry has delayed or deferred a significant amount of needed technology investment. This occurred for several reasons:
- The strong downward pressures on industry profits in the 1990-92 period.
- The desire to rebuild capital after profitability was restored in 1993.
- Banks' desire to cut costs and reduce operating ratios, which has typically affected data processing as much as any other part of the bank. Yet further investment in information technology might itself have reduced costs elsewhere in the bank.
The net result has been that the industry's total technology spending growth rate over the last five years - 6% to 7% - has been barely above inflation. Moreover, this growth rate is not predicted by survey respondents to increase at all by 1998.
It's worth noting that banks have especially fallen behind in what we might loosely call the client/server world - the complex of desktops, servers, software, and data bases that exist outside banks' older mainframe-based architectures. We estimate that about $5 billion of the $17 billion banks spend on technology goes into this world and that growth is about 12% - soaking up all of the industry's incremental net growth in systems spending.
Despite the higher growth rate, banks generally still lack state-of-the-art distributed computing systems, for several reasons. Client/server technology is much newer and was just beginning to build up a head of steam when the industry's financial crunch began in 1990. More often than not, client/server computing is used for new functions or systems - the easiest efforts to defer or cancel. Survey respondents emphatically believe client/server doesn't yet have mainframe-quality robustness and reliability, and won't for at least several more years. Resistance from the business side is sometimes greater. And finally, more training may be needed.
Still, the current wave of restructuring reflects a renewed recognition of the risks of not keeping up with client/server technology. Why else would some big-bank chief executives say that part of the reason for their big merger was to enable them to keep up with technology? Restructuring banks aren't going to increase their investments in older, mainframe-based technologies and legacy systems. On the contrary, most merging banks are counting on data center closings and legacy system rationalizations.
Adding to the restructuring imperative is the unique way that client/server economics works. Much more of a client/server system's total cost occurs outside the data center, in equipping each desktop worker with suitable computing power and upgrading networks.
Traditionally, the industry has thought of desktop computing as cheaper than mainframe computing. The theory goes: If only we could get all our processing on a micro, look at all the money we could save.
But the reality is that PCs are not replacing mainframes at all. Banks' client/server investments simply add more infrastructure to today's data centers and legacy systems.
We envision that in the future all bank employees will work from a PC suitably networked to a variety of client/server applications that, in turn, access centralized legacy data and systems.
Recently, a bank we know bought a used but powerful IBM mainframe for only $100,000. Yet the bank's PC budget for this year was $7.5 million. This represents a complete reversal of the historic information technology equation in which the cost of centralized hardware was dominant. In fact, many of today's critical core systems were engineered a generation ago (in the 1960s and '70s) specifically to conserve mainframe processing time.
The implication is that, in the future, each bank's total employment will drive up its client/server costs much more than its mainframe computing costs. (Each incremental employee may cost $5,000 to $10,000 for a fully allocated PC.)
Adding to the client/server expense are the many new skills needed. Banks need Sybase and Oracle experts, yet market forces and high demand make such people twice as expensive. Banks need Visual C++ programmers, yet experienced Cobol programmers are tough to retrain. Data base administrators are essential, yet cost an arm and a leg. At the same time, existing legacy systems can't be abandoned, so the new costs do not displace other costs but add to them.
The bottom line is that banks need to restructure to free up resources for client/server and other new technologies. Restructuring can do that by reducing the mainframe and legacy system investments, by reducing employment, and by creating banks with deeper pockets that can afford the costs of new infrastructure and new skills.
It is paradoxical that the mergers needed to create scale can also defer reinvestment simply because data processing departments must devote time and energy to conversions. Focusing on data center consolidations and selecting survivor applications can take from six months to a year after a major merger. During this period, major new initiatives may have to be deferred.
But such transition periods do end. Once restructuring is complete, greater scope and scale will encourage renewed efforts by banks to:
- Define application architectures.
- Reduce unneeded customization of applications.
- Standardize applications and data across geographic areas and/or subsidiaries.
- Upgrade seriously deficient applications, especially older proprietary ones.
- Build data warehouses to augment transaction systems.
- Make delivery channels more effective, especially in areas like branch automation where in some cases investments have been deferred since the early 1980s.
- Control risk better.
- Manage customer data more effectively.
- Do more to eliminate paper from operations and customer channels.
- Plan better for client/server systems, such as having a Windows migration plan.
Once restructuring is completed and technology reinvestments have caught up, we can expect the industry's gap with state-of-the-art systems to shrink. Banks' need for information - and for the technology to create and control that information - is never going to diminish. The technology infrastructure that is so expensive is literally the means of production for the bank. The banks that don't heed these changes will never be able to catch up. The banks that do will have the best chance to enjoy the opportunities that information technology can provide.
Mr. Teixeira is president of the Tower Group, a consulting firm in Wellesley, Mass.