EXECUTIVE VIEW: Technology Investments Have Led To Excess Banking Capacity

The traditional measure of a bank's capacity is the volume of loans it can handle given its capital and deposit levels. When loan demand is low, the excess funding capacity goes into securities. When loan demand is high, banks buy large Eurodollar deposits in the overnight market. In banks' role as financial intermediaries, this approach is timeless.

But the balance sheet is no longer a complete picture of a bank's capacity. Banks are turning into transaction and information processing institutions, and much of the value they provide to customers never appears on the balance sheet. That new value is measured in different units: accounts, transactions, accuracy, or the quantity of information processed.

Take the modern transaction account. The value it provides to the customer isn't measured by the balances or the interest generated. Rather, corporations care about the speed of payment delivery, the accuracy of the transaction, controls over credit risk, the payment information that travels with the transaction, and, of course, low cost. Consumers care about convenience, comprehensive reporting, accuracy, and, again, low cost.

Yet bank and industry statistics cover only deposit levels. Rarely do government banking statistics mention the number or type of accounts, the transactions processed, the amount of backup information, the volume and accuracy of statements, and so on. Of course, some banks know this about themselves. They monitor how many checking, savings, club, or Christmas accounts they have. They know average transactions per account, and possibly even cost, revenue, and profit per type of account. But they don't reveal the information publicly, and as a consequence there is no reliable industry data. How many checking accounts are opened annually in the United States? Of those, what fraction have a minimum balance requirement? How many new passbook savings accounts were opened last year in California? The basic answer to all these questions is: guess.

Were such data available, it would greatly help banks understand and value their growing investments in information processing. But even that wouldn't be all we need to know. Output, meaning current volumes, needs to be measured against capacity, defined as the level of accounts, transactions, or other units of information processing activity that a single bank, or the industry, could produce without any additional investment.

Capacity is widely followed in other industries. Airlines watch revenue passenger miles and load factors very closely. Freeways can handle up to 2,000 vehicles per lane per hour in peak periods. The Organization of Petroleum Exporting Countries knows that its capacity to increase crude oil output by 5 million barrels a day gives it political power. When the world tanker fleet has overcapacity, prices drop precipitously and every shipowner suffers. Unused industrial and manufacturing capacity in the U.S. helps control inflation by keeping down wages. The relative degree of undercapacity or overcapacity in an industry to produce a particular good or service influences the investment decisions of individual competitors. But in banking, such nonfinancial analysis is rarely possible.

There is every reason to believe that banking suffers from severe overcapacity. Some say the measure is the number of banks, but bank products are produced by technology, back offices, and branches, not by corporate organizations. Consolidation may be part of the answer, but the number of banks itself is not the measure of the overcapacity. Others point to the number of branches, yet brick and mortar is increasingly just one of several retail delivery channels. Reducing the number of branches won't reduce capacity at all if every teller is simply transferred to a telephone service center.

How should we measure a bank's capacity to produce information-based products? Raw data processing statistics certainly won't do. It isn't enough to say that with a data center processing 600 MIPS (millions of instructions per second), 20,000 PCs on employees' desks, or a new application package, a bank could produce x number of mortgages or funds transfers. Indeed, a complex formula would probably be needed for each separate product group. Retail transaction accounts, for example, might be considered one group, and the formula might include the following (a rough sketch in code follows the list):

Number of branches;

Square footage of all branches;

Number of owned ATMs;

Total person-hours per year of telephone customer service;

Source, version, and features of the DDA (demand deposit accounting) core system (for large banks, the allowable answers here would be a proprietary system from Hogan, Systematics, M&I, Infopoint, or another vendor);

The data center's MIPS and direct access storage device capacity;

Number and throughput of item processing equipment, such as reader-sorters and encoding machines.
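To make the idea concrete, here is a minimal sketch in Python of what such a product-group formula might look like. The inputs mirror the list above; the linear form, the function name, and every weight are invented for illustration and are not drawn from any survey or system.

```python
# Hypothetical linear capacity formula for retail transaction accounts.
# Every weight below is an invented placeholder, not an estimate.

def retail_dda_capacity(branches, branch_sq_ft, owned_atms,
                        phone_service_hours, dda_system_factor,
                        mips, dasd_gigabytes, items_per_hour):
    """Accounts a bank could support without further investment.

    dda_system_factor is a numeric stand-in for the source, version,
    and features of the DDA core system; the article gives no encoding,
    so treating it as a single scalar is an assumption.
    """
    return (
        1_500 * branches              # accounts per branch
        + 0.5 * branch_sq_ft          # accounts per square foot of branches
        + 2_000 * owned_atms          # accounts per owned ATM
        + 10 * phone_service_hours    # accounts per annual phone-service hour
        + 50_000 * dda_system_factor  # scale factor for the DDA core system
        + 400 * mips                  # accounts per data center MIPS
        + 5 * dasd_gigabytes          # accounts per gigabyte of DASD
        + 0.2 * items_per_hour        # accounts per item/hour of sorter throughput
    )

# Example with invented inputs:
print(retail_dda_capacity(120, 600_000, 300, 50_000, 1.0, 400, 2_000, 80_000))
```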

In theory, we could estimate the parameters of such an information-processing capacity model by taking the appropriate measurements at enough banks. We would run tests at each bank, adding transaction accounts to the existing volume until costs, quality, or other measures deteriorated, thereby showing that capacity had been reached.
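And here is a sketch of how that estimation itself might work, assuming measurements from a sample of banks and an ordinary least-squares fit; the article prescribes no method, and every figure below is made up.

```python
import numpy as np

# Each row: one bank's measured inputs, in the order of the list above
# (branches, square footage, ATMs, phone hours, DDA factor, MIPS, DASD, items/hour).
# y: the account volume at which that bank's costs or quality deteriorated,
# i.e., its observed capacity. All numbers are invented for illustration.
X = np.array([
    [120,   600_000, 300,  50_000, 1.0, 400, 2_000,  80_000],
    [ 40,   180_000,  90,  15_000, 0.8, 150,   700,  25_000],
    [300, 1_500_000, 800, 120_000, 1.2, 900, 5_000, 200_000],
    # ... in practice, far more banks would be needed than parameters
])
y = np.array([450_000, 140_000, 1_100_000])

# Ordinary least-squares estimate of the per-input capacity weights.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print(weights)
```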

Were such industrywide capacity measurements available, banks could better assess and value their technology and business investments. An investment in a line of business with a lot of overcapacity might then be deferred and replaced by an investment in a business line with unsatisfied demand.

Our hypothetical capacity model might turn out to have three parts; a rough sketch in code follows them.

The power to maintain and update data requires mainframe capacity, data storage, and some ancillary computer devices and systems software.

The ability to manipulate the data requires applications programs.

The ability to gather transaction data, sell the product in the first place, and service it on an ongoing basis requires a variety of channels to reach customers and/or sources of transactions.
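One way to formalize those three parts, as a minimal sketch: assume, as the article does not specify, that effective capacity is the bottleneck, that is, the minimum of the three components.

```python
from dataclasses import dataclass

@dataclass
class ProductCapacity:
    """Three-part capacity for one product line; each field is the
    account volume that part alone could support (invented units)."""
    data_maintenance: int   # mainframe, storage, systems software
    data_manipulation: int  # application program throughput
    channels: int           # transaction gathering, sales, and servicing

    def effective(self) -> int:
        # Bottleneck assumption: the weakest part caps the whole line.
        return min(self.data_maintenance, self.data_manipulation, self.channels)

# Example with invented numbers: the application system is the constraint.
dda = ProductCapacity(data_maintenance=900_000,
                      data_manipulation=650_000,
                      channels=700_000)
print(dda.effective())  # 650000
```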

The important point is that technological trends are constantly expanding this capacity. The installed base of MIPS and direct access storage devices goes up every year; mainframe-oriented data center capacity grows 20% to 25% a year, and distributed systems grow even faster. The application programs themselves usually have throughput constraints governed by their underlying design, but those constraints are being pushed to higher and higher levels, and the programs are constantly enhanced with new features and functions.

Finally, the transaction gathering, sales, and servicing channels have proliferated. In retail banking, deposit-gathering influences include: the growth of ATMs in the 1970s and 1980s; improvements in branch automation; growth in recurring automated payments (the automated clearing house); and the growing interest in home banking channels.

The upshot is that the industry has significant, but unmeasured, overcapacity. Banks' total information processing capacity has gone up by, let's say, 10,000% over the past 20 years, far, far more than any increase in the business itself. The only known effort to quantify this situation was the 1992 Ernst & Young/American Banker Technology Survey, which estimated industry retail demand deposit account capacity at 224 million accounts and production at 157 million accounts, or about 30% unused capacity for that line of business. Similar calculations were performed for other lines of business, but the survey results needed validation, a better theory, and a more rigorous database. They also needed to become part of a continuous effort aimed at producing new data every year.
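The arithmetic behind that 30% figure, as a quick check:

```python
capacity = 224_000_000    # estimated industry retail DDA capacity, in accounts
production = 157_000_000  # estimated accounts actually produced
unused_share = (capacity - production) / capacity
print(f"{unused_share:.1%}")  # 29.9%, roughly 30% unused capacity
```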

Of course, individual banks make new capacity investment decisions all the time. For example, if a regional bank with 100,000 mortgage accounts buys a portfolio of 300,000 additional mortgages, it knows quite well it will have to expand processing capacity. It will almost certainly have to add central computing resources and direct access storage device capacity, it may have to upgrade/replace its application system, and it most likely will have to increase the capacity of its transaction receipt channel, i.e., the retail lockbox. But it will probably make all these decisions in complete ignorance of total industry mortgage processing capacity.
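The bank's internal check is simple enough to sketch, with invented per-component figures; only the 100,000 and 300,000 account counts come from the example above.

```python
# Pre-acquisition capacity check for the mortgage example.
# The account counts come from the text; every per-component
# capacity below is invented for illustration.
projected_accounts = 100_000 + 300_000

component_capacity = {
    "central computing (MIPS)": 320_000,
    "DASD storage":             280_000,
    "application system":       350_000,
    "retail lockbox channel":   250_000,
}

for component, cap in component_capacity.items():
    if projected_accounts > cap:
        print(f"Expand {component}: short {projected_accounts - cap:,} accounts")
```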

In some lines of business, consolidation has proceeded a long way, industry capacity is more visible, and competitors take the appropriate action. Bank of New York Co.'s recent acquisition of BankAmerica Corp.'s American depositary receipt processing business is an example. Bank of New York, the market leader, has the systems to support the incremental processing business. As BankAmerica sold out, its older ADR processing capacity left the industry, to the benefit of all remaining ADR processors.

Now is the time for banks to begin measuring and thinking about industry capacity from an information processing point of view. Such thinking will not, and should not, replace balance sheet and loan demand measurements. But it can augment traditional data to give a more balanced view. Future information technology investments will someday make today's spending look anemic and will make production capacity far more important. Someday, banks will be more cognizant of technology's effects on their bottom line. Those banks hoping to dominate in the future should begin their capacity calculations sooner rather than later.
