Client-server is the brightest spot on the technological horizon.

Of all the new technologies available today, client-server computing is very likely to have the highest impact. This is because client-server is a paradigm, a way of doing things, and not just a new "technology." Paradigms tend to have more impact because they affect what goes on around them; they generally enable new businesses or strategies and disable - or eliminate - old ones.

However, paradigms can also be confusing. There is no exact definition of "client-server," nor universal agreement on using the phrase at all. (Concepts like "distributed computing" and "network computing" have just as much validity. Many professionals draw distinctions among these concepts, but again, there is no universal agreement.)

Client-server computing is a style of computing in which applications and data reside on multiple computers. There is no host. In the traditional style of computing, the host contained the data, the application processing, and the presentation control; the workstation or terminal was responsible only for the display.

In the new style, the host is called a server, and it holds only some of the data and does only some of the processing. The workstation or terminal is called the client, and it is responsible for the display, the presentation control, some of the processing, and some of the data.
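
To make that division of labor concrete, here is a minimal sketch in Python. The account numbers, balances, port, and function names are hypothetical illustrations, not any real banking system: the server owns the data and does its share of the processing, while the client issues the request and controls the presentation.

    # Minimal client-server sketch. All data and names are hypothetical.
    import socket

    ACCOUNTS = {"1001": 250.00, "1002": 75.50}  # toy data held on the server

    def run_server(host="localhost", port=9090):
        # The server owns the data and does the lookup (its share of the processing).
        with socket.create_server((host, port)) as srv:
            conn, _ = srv.accept()
            with conn:
                account = conn.recv(64).decode().strip()
                balance = ACCOUNTS.get(account)
                reply = f"{balance:.2f}" if balance is not None else "NOT FOUND"
                conn.sendall(reply.encode())

    def run_client(account, host="localhost", port=9090):
        # The client sends the request and controls how the result is presented.
        with socket.create_connection((host, port)) as conn:
            conn.sendall(account.encode())
            print(f"Balance for account {account}: {conn.recv(64).decode()}")

    if __name__ == "__main__":
        import threading, time
        threading.Thread(target=run_server, daemon=True).start()
        time.sleep(0.2)     # give the server a moment to start listening
        run_client("1001")  # prints: Balance for account 1001: 250.00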

This new paradigm has come about, like most things, because of the economics of hardware. Smaller computers are more cost effective than larger ones by at least two orders of magnitude, but their capacity is smaller. So instead of cramming everything onto a large computer, where the volumes will fit, and suffering the cost disadvantages, the new paradigm is to break up the work so that it fits on the smaller computers. Since the "work" consists of applications and data, breaking up the work means breaking up the applications and data.
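
A back-of-the-envelope calculation illustrates the claim. The prices and MIPS ratings below are hypothetical, chosen only to show what a two-orders-of-magnitude gap in cost effectiveness looks like:

    # Hypothetical, illustrative figures -- not actual vendor prices.
    mainframe_cost, mainframe_mips = 10_000_000, 100  # $100,000 per MIPS
    micro_cost, micro_mips = 10_000, 10               # $1,000 per MIPS

    ratio = (mainframe_cost / mainframe_mips) / (micro_cost / micro_mips)
    print(f"Cost per MIPS is {ratio:.0f}x higher on the large machine")
    # -> Cost per MIPS is 100x higher on the large machine: two orders of magnitude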

It is worth noting that this is a fundamental, one-time paradigm shift. We are still in the very early stages of the total development of the computer. With the passage of time, today's mainframe-based architectures will look incredibly archaic, just as the early years of the industrial era look archaic today.

Two hundred years ago, for example, machines were incredibly expensive and big. Early textile factories had thousands of belts running off one large power source just to reach different looms. Cable car technology was for a brief period exactly the same in many American cities: The cars had no power themselves but were pulled by miles of cables run off one huge power source. But the paradigm changed; motors became efficient and tiny, and today the average American home has numerous motors, or sources of power, in it.

It is not hard to see an analogous progression in banking. Microcomputers have already penetrated the industry extensively. Further penetration will occur, and the unit cost of micros will continue to drop faster than the unit cost of large machines. The costs of conversion and the added infrastructure and overhead costs of the client-server paradigm are high, but they will not change much. Over time, the cost disadvantage of the large-machine paradigm will grow relative to these conversion and infrastructure costs.

So it is inevitable that the new paradigm will become the rule. The only questions are:

* When?
* In what sequence will work move from one paradigm to the next?
* What parts of today's large-machine infrastructure will continue to exist?
* Will the nightly batch cycle exist forever?
* What will be the impact on the business of banking?

Although the new paradigm is driven by ever-cheaper small computers, there are unlikely to be any net cost savings for banks. Justifying the new paradigm with expected budget reductions would be like chasing a will-o'-the-wisp. The fact is, client-server technology is extremely complicated. It is more difficult to build and support than the old paradigm. Moreover, many of the component technologies that must be pieced together are young and therefore riskier.

It would be hard, even starting from scratch, to build a banking technology infrastructure based on the new paradigm. To change over from the existing, fully featured, highly functional, mainframe-centric paradigm with which we are all familiar - while never missing a penny on anybody's account - is six times as hard.

Thus, it is no wonder that progress has been slow. Most new client-server implementations run on LANs, with a desktop machine serving as the client and a powerful workstation or minicomputer serving as the server.

Yet, according to last year's American Banker/Ernst & Young survey, only 4% of the industry has even one relatively high-volume monetary processing application with a component that runs on a LAN.

In other words, banks' key applications - the ones that do involve monetary transactions - are not yet trusted on LANs, so they generally do not run on client-server architectures either.

Another issue intimately related to the new paradigm is open systems. Again, while this phrase has no precise definition, it generally means that each of the many different pieces of the infrastructure, both hardware and software, can be bought from a different vendor and still work together. When all the work is done on one box by one system, a single vendor with a proprietary approach may prevail. But when the work is broken up, many different pieces, bought from many different sources, must all work together.

The older, proprietary operating systems (like MVS, IBM's mainframe operating system), which most of the banking industry currently uses, are not "open." The most widely accepted example of an open operating system is Unix.

In my consulting practice, I have found that a very small percentage of today's banking mainframe applications are Unix based. (An estimated less than 10% of minicomputer applications are Unix based.) While some growth in these numbers is expected, the message is that open systems, a necessary corollary of client-server architectures, are not here yet.

Two interesting questions for bankers: Will the highest-volume transaction processing systems ever convert to the new paradigm? Will the nightly batch cycle ever change? These questions probably won't be answered in the foreseeable future.

Almost by definition, high volumes dictate use of a large computer. Desktops can't do the job, even if they do have a 50 MIPS rating (more than many mainframes). They are "I/O bound," meaning they can't get data in and out of the data base fast enough to handle the volumes. In essence, having a high transaction volume becomes a business requirement that dictates a computer paradigm of higher-than-minimal cost.
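
A rough calculation shows the problem. All figures below are hypothetical, chosen only to illustrate the order of magnitude:

    # Hypothetical figures, for illustration only.
    accounts = 5_000_000   # checking accounts to post each night
    ios_per_account = 4    # data base reads/writes per account posting
    window_hours = 6       # length of the nightly batch window

    required_rate = accounts * ios_per_account / (window_hours * 3600)
    print(f"Sustained rate needed: {required_rate:,.0f} I/Os per second")
    # -> roughly 926 I/Os per second, sustained for six hours -- well beyond
    #    the few dozen I/Os per second a single desktop disk can sustain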

Can these high-volume transaction systems be broken up into many smaller transaction systems? The answer, again, is probably not. Like it or not, the costs of computing platforms are only a small part of the overall costs of banking and financial transaction processing - less than 3% at best. Financial, operational, marketing, and risk costs dominate the business of banking. And there is ample evidence that economies of scale apply in these areas.

Processing millions of checking accounts per night, hundreds of thousands of automated clearing house transactions, or tens of thousands of money transfers is always going to be done in bulk - because of the economics of collecting and controlling the transactions, the advantages of controlling the customers who create those transactions, and the need to manage the associated financial and credit risk. In truth, the trend runs in the opposite direction: industry consolidation will create ever-larger customer groups, data bases, and even higher-volume transaction systems.

In conclusion, major change is always slow, and the banking industry's existing technological infrastructure will be no exception. But in those banking areas where new investment is being made - where the cutting edge is - the new paradigm is fully in evidence.

The bottom line is: Employ the new paradigm aggressively on the margin. Bet on the declining costs of this paradigm. But don't go too far ahead. Keep the old paradigm alive and well maintained for the rest of the decade - at least.
