Technology gurus have predicted for years that networks of personal computers eventually will replace mainframe and midrange machines as the dominant computing engine in corporate America.
In banking, this transition is slowly but surely taking hold. And even bankers who work outside data processing areas are getting excited about it.
"This is the direction of the business, and the way of the future," said William H. Berls, a group vice president in Chemical Banking Corp.'s corporate trust unit.
The 1993 American Banker/Ernst & Young Survey of Technology in Banking illustrates that the transition to distributed computing is under way.
Among other findings, the survey shows that 4% of the industry's "on-line" production software runs on so-called distributed computing systems, in which networks of PCs work in tandem with larger machines.
In three years, bankers expect to run 12% of their business-specific "production" software on distributed networks. In that time, bankers also expect to boost from 6% to 13% the proportion of production software running on PCs.
By contrast, 89% of the banking-specific software in the industry now runs on mainframe or midrange computers, down from 92% last year. The total is expected to drop to 75% by 1996.
Mainframe and midrange computers dominate mainly because they have been around longer than PCs. They have also traditionally been the only machines powerful enough to crunch the volumes of data generated by such bank products as checking accounts, consumer loans, credit cards, and automated teller machines.
But PCs are growing more powerful, and, when hooked into networks, they can rival the much larger machines at a fraction of the cost.
"At the highest level, the [price/performance] benefit of PCs over mainframes is literally 10 to 1," said Todd Rulon-Miller, chief executive of Software Alliance Corp., Berkeley, Calif.
PCs are also benefiting from a whole new class of easy-to-program software that lets programmers design software applications in weeks, rather than the months or years it can take using mainframes.
The result: Many bankers, including Citicorp chairman John Reed, think the time has come to start shifting to distributed computing.
In a speech last month at a computer conference in Boston, Mr. Reed glumly spoke of flat revenue growth and a global economy "in a refrigerator." He perked up, though, when describing how Citicorp can boost profits by "reengineering" the way people work using "on-line, distributed technology."
These "modern systems" could eventually cut Citicorp's annual operating costs in half, Mr. Reed said.
And he isn't alone. Now, nearly 60% of the banks in the country are either testing or running at least one major production application on a distributed computing network, according to the survey. In three years, 96% of the country's banks are expected to run at least one production application on a distributed network.
These ambitions are affecting bank purchasing plans.
There are now 805,000 PCs in banks, according to the survey. Bankers expect to have more than 1.2 million PCs by 1996. By contrast, there are 2,920 mainframes, a total expected to drop to 2,650 in three years. Similarly, there are 2,220 midrange computers, a total expected to drop to 1,860 over the same period.
In terms of computing power, PCs already dominate. One standard measure of this is based on the millions of instructions a machine can execute in a second, or MIPS.
There are now 64,000 mainframe MIPS in banks, 10,000 midrange computer MIPS, and five million personal computer MIPS, according to the study.
But even though PCs have the bulk of computing power, mainframes store most of the data. Banks now have the ability to store 2,920 terabytes of data. That's roughly two million pieces of information - such as a Social Security number or street address - for each man, woman, and child in the United States.
Ninety-six percent of this data storage capacity is on mainframes, the study indicates.
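The survey's aggregate figures can be checked with back-of-the-envelope arithmetic. The sketch below assumes a 1993 U.S. population of roughly 255 million and an average of about 5 bytes per "piece of information" — neither figure appears in the article:

```python
# Rough arithmetic behind the survey's aggregate figures.
# Assumption: 1993 U.S. population of about 255 million people.
US_POPULATION = 255_000_000

# Computing power, in millions of instructions per second (MIPS).
mainframe_mips = 64_000
midrange_mips = 10_000
pc_mips = 5_000_000
pc_share = pc_mips / (mainframe_mips + midrange_mips + pc_mips)
print(f"PC share of total MIPS: {pc_share:.0%}")  # PCs hold nearly all raw power

# Storage: 2,920 terabytes spread across the population.
total_bytes = 2_920 * 10**12
bytes_per_person = total_bytes / US_POPULATION
# At roughly 5 bytes per item (a short code, part of an address), that
# works out to about two million pieces of information per person.
items_per_person = bytes_per_person / 5
print(f"Items per person (at ~5 bytes each): {items_per_person:,.0f}")
```

The result lands a little above two million items per person, consistent with the article's characterization.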
Distributed networks let banks leverage the best features of each type of computer by making it possible to use PCs to retrieve data over a network from central mainframe repositories.
Among the banks building such computing systems is Chase Manhattan Corp., in New York.
"We've been extremely active for about 14 months" with a brand of distributed computing called client-server," said Eugene Friedman, a Chase vice president.
Client-server is a type of computer system where so-called PC clients run software that retrieves data from powerful server computers over a network.
The servers can be powerful PCs, midrange computers, or mainframes.
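The client-server pattern can be sketched in miniature. The account records, field names, and query protocol below are invented for illustration — a real bank system would query a mainframe database over a production network, not an in-process dictionary:

```python
import json
import socket
import threading

# Hypothetical account records standing in for a mainframe repository.
ACCOUNTS = {"1001": {"name": "J. Smith", "balance": 2500.00}}

def serve_one_request(server_socket):
    """Server side: answer a single client query with account data."""
    conn, _ = server_socket.accept()
    with conn:
        account_id = conn.recv(1024).decode().strip()
        record = ACCOUNTS.get(account_id, {"error": "not found"})
        conn.sendall(json.dumps(record).encode())

def query_account(port, account_id):
    """Client side: a PC 'client' asking the server for a record."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(account_id.encode())
        return json.loads(sock.recv(4096).decode())

# Wire the two halves together on an ephemeral local port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=serve_one_request, args=(listener,)).start()
print(query_account(port, "1001"))
```

The essential division of labor is visible even at this scale: the client holds the user interface and the request logic, while the server holds the data.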
Virtually every division in Chase has at least one client-server application, Mr. Friedman said. One of the most ambitious client-server applications was installed last year in the bank's credit card unit.
This software, which Chase designed itself, runs on networks of personal computers used by about 700 customer service representatives in telephone centers in Arizona, Delaware, Florida, and Long Island, N.Y.
Representatives punch buttons on their personal computers, and, over a network, get account information from mainframe computers on Long Island.
The PCs display the information on easy-to-read screens, and speed calls by "queuing" up data in anticipation of the next question a customer will ask.
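The "queuing" idea — fetching the likely next answer before the customer asks for it — can be sketched as a simple prefetch cache. The follow-up heuristics and account fields below are invented, not taken from Chase's system:

```python
from collections import deque

# Hypothetical heuristic: after a balance inquiry, customers most often
# ask about recent transactions, then about the payment due date.
LIKELY_NEXT = {
    "balance": ["recent_transactions", "payment_due"],
    "recent_transactions": ["payment_due"],
}

# Invented stand-in for a mainframe lookup.
MAINFRAME = {
    "balance": "$2,500.00",
    "recent_transactions": "3 charges this week",
    "payment_due": "June 1",
}

cache = {}
prefetch_queue = deque()

def answer(field):
    """Serve a representative's query, then prefetch likely follow-ups."""
    value = cache.pop(field, None) or MAINFRAME[field]
    prefetch_queue.extend(LIKELY_NEXT.get(field, []))
    # Idle time between questions is used to pull queued fields locally.
    while prefetch_queue:
        nxt = prefetch_queue.popleft()
        cache[nxt] = MAINFRAME[nxt]
    return value

print(answer("balance"))              # fetched on demand
print(answer("recent_transactions"))  # served from the prefetched cache
```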
Mr. Friedman said a boost in worker productivity and a reduction in data-entry errors will pay for the multimillion-dollar computer system in about 20 months.
To accelerate the use of client-server applications, Chase has also created a committee of computer specialists to set standards for the type of client-server technology that divisions should buy or design.
Banks around the country have formed similar committees. And this standard-setting activity is a potential source of controversy.
On the one hand, standards are needed to ensure that bankers buy good computer systems, at a fair price, and don't expose the bank to security risks.
But the problem is that setting standards impinges on the freedom that departments have traditionally enjoyed to choose their own PCs and PC software.
"There's a conflict because users want more control over their destiny," said Jeffrey Rocchio, a computer planning analyst at National City Corp., Cleveland. Mr. Rocchio works on a client-server standards committee recently established by the bank.
Most people, including bankers, use PCs to run so-called personal productivity software, like spreadsheets and word processing applications. Many people also play computer games.
But as client-server banking software starts to run on these PCs, technologists are wondering if they will have to tell users not to run their favorite personal software.
The problem is that personal software could eat up so much memory on a PC that the banking applications can't run. Or the personal software could have strange attributes that interfere with banking applications.
There is also the danger that users might bring in software with computer viruses, and infect an entire bank.
So far, banks like National City haven't clamped down on personal software. But this could change.
"There are differences of opinion over how far we should go with standardization," Mr. Rocchio said.
The distributed software that banks are installing usually does not replace mainframe or midrange computer software.
This change, called downsizing, has happened for only 300 mainframe applications over the past three years, according to the study. There are now more than 200,000 mainframe banking applications in the country.
In the next three years, another 1,100 mainframe applications are expected to be downsized.
Instead, most of the new, distributed computing software is either giving bank employees and customers better access to data on mainframe or midrange computers, or automating tasks that have never before been computerized.
In fact, there are very few core banking applications that run only on PCs. Instead, almost all of the core banking software in the industry runs on mainframe or midrange computers, with PC access available as an option.
But, over time, this could change, as vendors try to take greater advantage of the cheap computing power available on PCs.
"I think it's a matter of when, rather than whether," said Charles Kight, executive vice president at Michigan National Corp., Farmington Hills.
One vendor that is trying to be a leader in the shift is Savings and Loan Data Corp., Cincinnati.
Since 1978, this not-for-profit service bureau has used mainframe computers to run core banking software for its owners - some 170 thrifts and banks, based mainly in the Midwest.
Last year, the company, which goes by the name S&L Data, decided to diversify by buying a struggling start-up vendor that had built a core accounting, general ledger, and teller software package, called Anthem, for PC networks.
Jack Kuntz, an S&L corporate vice president, said S&L bought the company to prepare for "platform migration" of computing operations from mainframes to PC networks.
"That's the wave of the future," Mr. Kuntz said. "That's where the leaps are going to be made in technology."
Six banks, each with less than $100 million of assets, now use Anthem to run their core banking operations.
One customer is Founders Bank of Arizona, Scottsdale, which bought the Anthem software in 1991. "We spent a lot of time nursing it along," conceded Jan McDowell, senior vice president and cashier.
But despite the hassles of dealing with "bleeding edge" technology, Ms. McDowell said the bank has been happy with its move. The software cut data processing costs 24%.
The articles in this publication analyze and report the results of the American Banker/Ernst & Young 1993 Survey of Technology in Banking, conducted by the Tower Group. The survey, containing many detailed questions on banking technology, was delivered to the chief technology executives at every bank holding company in the United States with over $1 billion in assets (275 banks) and to the chief executive officers at 1,200 selected bank holding companies with under $1 billion in assets.
One hundred and seventy-four surveys were returned, allowing the survey to incorporate data from just over 50% of the industry, as measured by noninterest expense. Noninterest expense was chosen as a measurement because, of all publicly known figures, it best correlates with technology spending.
Ninety-six of the respondents were from banks with over $1 billion in assets, including more than 70% of the top 25 banking companies and over 50% of the top 100.
In calculating the industry projections, each bank's responses were weighted according to the percentage of the industry, as measured by noninterest expense, that they represent. Both quantitative and qualitative responses were used to develop industry projections.
The weighting ensures that, for example, if a large money-center bank were strongly committed to imaging, it would be considered proportionately more significant for establishing an industry trend than a similar response from a smaller bank.
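A minimal sketch of that weighting scheme, using invented noninterest-expense figures and responses:

```python
# Hypothetical respondents: (noninterest expense in $ millions, response).
# The response here is each bank's projected share of software running
# on distributed networks; all figures are invented for illustration.
respondents = [
    (4_000, 0.15),  # large money-center bank
    (800, 0.10),    # mid-size regional
    (200, 0.05),    # small bank
]

total_expense = sum(expense for expense, _ in respondents)

# Each response counts in proportion to the bank's share of expense.
weighted_projection = sum(
    (expense / total_expense) * response for expense, response in respondents
)
print(f"Weighted industry projection: {weighted_projection:.1%}")
```

Here the large bank's answer pulls the industry estimate toward 15%, while an unweighted average of the three responses would sit at 10% — which is exactly the effect the survey's methodology intends.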
All data represent estimates for the U.S. commercial banking industry as a whole. For a large variety of questions on this survey, the statistical validity of the results will vary based on the number of respondents who answered that particular question, the degree to which their interpretation of the question was consistent with others, the structure of the particular question, and the degree to which the distribution of the respondents answering the question is representative of the distribution of the population at large.