Parallel processing looks increasingly attractive.

Parallel processing, conventionally held out as a quick and efficient problem solver for the scientific and government community, has started to take firm hold within the financial world.

These computing systems - descended from the very powerful and very expensive supercomputers - are composed of many microprocessors ganged together in loosely coupled clusters of several units each. The processors feed off the same pool of computer data, but thanks to their special architecture they can work independently, attacking tasks or queries from several angles at the same time.
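In programming terms, the division of labor works roughly like the sketch below - a deliberately toy example, with invented data and function names, that splits one pool of records into partitions and lets each worker scan its share independently, the way the clustered processors attack a query from several angles at once.

```python
# A toy illustration of the architecture described above: one shared pool
# of records is split into partitions, and each worker process scans its
# partition independently before the partial answers are merged.
# Everything here (the query, the data, the partition count) is invented
# for illustration; it is not any bank's actual system.
from multiprocessing import Pool

PARTITIONS = 4  # stands in for the many processors in a real cluster

def count_high_balances(partition):
    # Each worker runs the same query against its own slice of the data.
    return sum(1 for balance in partition if balance > 10_000)

if __name__ == "__main__":
    # The shared "pool of computer data": account balances, here faked.
    balances = [500, 12_000, 80_000, 3_200, 45_000, 9_999, 15_000, 700]
    partitions = [balances[i::PARTITIONS] for i in range(PARTITIONS)]

    with Pool(PARTITIONS) as pool:
        partial_counts = pool.map(count_high_balances, partitions)

    print(sum(partial_counts))  # prints 4: the merged answer
```

Adding more workers shortens the scan without changing the query itself, which is why capacity in such systems grows simply by adding processors.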

For large financial institutions, these machines represent greater opportunities to process either information or transactions faster and with greater accuracy. The results, according to bankers who are using these systems, have yielded more finely tuned marketing campaigns, better analysis, and greater processing capacity and speed.

"Parallel processing in dataintensive areas is going to make a sweeping impact on the financial industry," said Howard Richmond, vice president for high-performance computing at the Gartner Group Inc. in Stamford, Conn. "The large retail establishments, like WalMart, really led the charge into parallel processing; no other system could have allowed them to grow like they did."

Unlike supercomputers - many makers of which have been foundering lately - parallel processing systems are cheaper to develop and more flexible, since computing power can be increased by adding more processors.

"Massively" parallel processing systems - the likes of which have long been in place at BankAmerica Corp., Chemical Banking Corp., and Citicorp have a greater capacity to grow than the closely related symmetrical multiprocessors, which are even cheaper and easier to configure, but can only operate successfully up to a limited number of processors.

Decreasing costs and growing computing power have recently spurred more and more banks to embrace, or at least to investigate, this technology. The price of massively parallel processing systems has dropped an average of 85% over the past decade, as their power - gauged in MIPS, for millions of instructions per second - has doubled every two years, according to Mr. Richmond.
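Those two trends compound. Doubling every two years over a decade works out to five doublings, or a 32-fold gain in power; combined with the 85% price drop, the cost per MIPS falls to roughly half a percent of its decade-ago level, as this back-of-the-envelope arithmetic shows:

```python
# Back-of-the-envelope arithmetic on the trends Mr. Richmond cites:
# power doubles every two years while the system price falls 85%.
years = 10
power_gain = 2 ** (years / 2)    # five doublings in a decade -> 32x
price_factor = 1 - 0.85          # an 85% price drop leaves 15% of the cost
cost_per_mips = price_factor / power_gain

print(f"Power gain: {power_gain:.0f}x")      # 32x
print(f"Cost per MIPS: {cost_per_mips:.4f}") # 0.0047, about 0.5% of the old cost
```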

"Historically, we've run into a wall with the price performance of these systems," said Craig Goldman, chief information officer for Chase Manhattan Corp. "But today, the combination of better technology and price performance have made it aviable alternative."

Chase became one of the latest converts this summer when it began installing a massively parallel processing system for marketing in its credit card division.

AT&T Global Information Solutions, formerly NCR Corp., the system's vendor, courted Chase for two years before executives decided to stake several million dollars on the 128-processor system.

Based on initial tests, the parallel processing system has been able to process data for customer segmentation and analysis in one-seventh the time it took for Amdahl Corp. mainframe computers, according to Jonathan Vaughan, Chase's vice president for applied technology.

"The way we're going to differentiate ourselves in the future is by exploiting and mining our own data assets," Mr. Goldman said.

In fact, the bank already plans to roll the technology out into as many as a dozen other areas, starting as soon as early 1995, Mr. Goldman added.

Although it has become attractive to most financial institutions only fairly recently, the massively parallel processing architecture has existed for several years. In fact, a handful of institutions, such as BankAmerica Corp., adopted earlier generations of this parallel processing hardware years ago from Teradata Corp., which NCR later acquired.

BankAmerica has employed massively parallel processing for eight years in many segments of its business. In 1986, the San Francisco-based holding company started using the system in its credit card area to improve marketing potential. Over the next few years, the bank steadily rolled the technology out to its consumer lending area, and into its retail bank for marketing, then to help evaluate risk in its credit and portfolio management area, and most recently, into its finance division for acquisition analysis.

Now, the West Coast money center runs a 162-unit system containing more than 600 gigabytes of storage, and it handles much more than simple queries: Officials say that it has been essential to analyzing every potential acquisition for the past three years.

"This whole technology has businesswide support," said Charles W. Griffith, the systems director and vice president for retail MIS at the bank.

By mining the bank's vast volumes of data, the system has been able to provide sound decision support more quickly and accurately than other available technology, he said.

BankAmerica can obtain a snapshot of its customer relationships within four days, Mr. Griffith said, as opposed to the two months it used to take to run on the mainframe.

Beyond the data warehousing concept, executives use the system to analyze the potential fallout from hypothetical situations - for example, the closing of a nearby military base or the impact of changed interstate banking regulations.

But some banks and vendors also see applications for parallel processing beyond the realm of decision support.

"Five years ago, parallel processing meant one thing, today it means something totally different," according to Philip G. Hensley, the vice chairman of First Bank System.

As one of the pilot sites for International Business Machines Corp.'s new parallel transaction server, the S/390, First Bank System has been running on-line transactions, except for its batch functions, through this system since the beginning of this year.

Transaction processing on a parallel platform costs one-third as much as processing on the traditional mainframe, said Mr. Heasley.

The Minneapolis-based bank holding company plans soon to move its trust processing business to the parallel server.

Mr. Heasley, who sits on the board of Cray Research, the supercomputer maker, believes this computing will reengineer the entire processing business, cutting into the mid-range computer market.

But not everyone sees core systems moving to a parallel universe.

"We view massively parallel processing as an application for a very narrow focus - in decision support for users with very large data bases," said Brian Richardson, a senior research analyst for open computing and server strategies at Meta Group Inc., a consulting firm in Westport, Conn.

Mr. Richardson believes that many companies expect this computing technology to become a panacea for all their processing woes. He questioned both the expensive and elaborate structure of such devices and the viability of some of their more established vendors.

Mr. Richardson, who has been a consultant to more than 15 financial institutions, said that he usually recommends symmetrical multiprocessing systems for companies that need 500 gigabytes of storage or less. Although these systems have a more limited capacity for growth, Mr. Richardson said that the symmetrical multiprocessors have greater data base functionality. Massively parallel systems, he said, are limited to two proprietary data bases: AT&T GIS's Teradata or Tandem's NonStop SQL.

But Mr. Richmond from the Gartner Group believes the financial industry will indeed account for a large chunk of the burgeoning commercial parallel processing market.

"I think you'll find this [technology] proliferate rapidly in 1995, at regionals as well as money centers," Mr. Richmond said. "This industry is alive and well and very happy."
