Laggards Apart, Banks Get Ready For Computer Code Glitch in 2000

Many banks are finally beginning to confront the so-called year 2000 problem.

But surprisingly, some are still ignoring it. And even for those making changes, the task often is turning out to be more strenuous than anticipated.

As most bankers know by now, the "2000 problem" involves billions of lines of computer code that use two digits to represent the year in a date; 1990 is 90, 1956 is 56.

But 2000 is represented as 00, which will be read as 1900 by some systems and as nothing at all by others.

Computer programs representing years by two digits in calculations - such as those measuring accrued interest - are bound for trouble if the code is not modified.
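The arithmetic failure is easy to demonstrate. Here is a minimal, purely illustrative sketch in Python (the affected bank systems were typically older mainframe code, and the function names below are hypothetical, not drawn from any actual banking system):

```python
# Legacy-style date arithmetic: years stored as two digits,
# as in the billions of lines of code described above.

def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Difference of two-digit years -- the legacy approach."""
    return end_yy - start_yy

# A deposit opened in 1996 ("96") and evaluated in 1999 ("99") works:
assert years_elapsed_two_digit(96, 99) == 3

# The same deposit evaluated in 2000 ("00") appears to be -96 years old,
# so any accrued-interest calculation built on it goes haywire:
assert years_elapsed_two_digit(96, 0) == -96

# Once an affected line is found, the fix itself is simple --
# carry the full four-digit year:

def years_elapsed(start_yyyy: int, end_yyyy: int) -> int:
    """Corrected approach: four-digit years."""
    return end_yyyy - start_yyyy

assert years_elapsed(1996, 2000) == 4
```

As the sketch suggests, the one-line fix is trivial; the expense lies in locating every date-dependent line among billions.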

For banks, the revision will be huge and expensive.

According to the Tower Group, a consulting and research firm in Newton, Mass., U.S. commercial banks have about 9.3 billion lines of computer code. An estimated 4% of these, or 370 million lines, have a date function that needs modification. Total cost to fix the problem during the next four years: more than $7 billion.

Though modifying code usually is simple, finding the lines that need to be changed is time-consuming and mind-numbing.

And as the industry draws closer to an immovable deadline, prioritizing year 2000 projects has become a huge issue for an increasing number of banks.

"Up until a few months ago, just about everybody working on the year 2000 problem was operating on the premise that it was all about fixing programs, fixing code," said Joel S. Goldhammer, vice president at A.T. Kearney, the Chicago consulting division of Electronic Data Systems Corp.

But now, he said, many realize the problem is not so much a technical fix as an enterprisewide business effort that requires sustained attention.

"At any company that has independent business units, the units will have to fix their own code, but the risk of failure of any of these units exists at the enterprise level," Mr. Goldhammer said.

Accordingly, the most important aspect of planning and implementation for 2000 now involves determining the risk posed by the potential failure of specific business units.

The problem exposes banks to several kinds of risk, Mr. Goldhammer said.

"What we do is help clients to identify the four or five areas of major risks - like financial, regulatory, image, in some cases fiduciary - and we help them measure, application by application, how they're doing against that enterprisewide business risk," he said.

One of the tools used by A.T. Kearney is a "skyscraper chart," which is a three-dimensional graph of the risk to the enterprise posed by various business units. The risk is based on the complexity and size of the applications used by each unit.

A number of vendors are engaged in similar efforts to prepare for the new millennium at banks.

Like Mr. Goldhammer, many are seeing changes in the way banks approach the problem.

Some institutions are using the occasion to reassess their applications software to see whether it is worth fixing. For banks with turnkey software, replacing old applications with a new package that recognizes 2000 can bring new functions while helping the bank avoid the cost of fixing date references.

"The problem with fixing code is that when you're done with all the work and expense, you still only have what you had," said Collins Andrews, president of U.S. commercial banking at Alltel Information Services Inc.

"I would say that 50% of responses from the various business units are that they don't have to worry about the code for year 2000 because they're going to replace the system," said Mr. Goldhammer, whose company works on the problem with Citicorp and a number of nonbanks.

But others noted that replacing software systems often is not the easy option it may appear.

"Large banks still have a majority of proprietary systems and often customize what they've bought, so replacement's often not a viable option for them," said George Kivel, an analyst at Tower Group and co-author of a report on year 2000 issues in banking.

Though some have predicted 2000 may create a miniboom for software providers as banks look to replace rather than recode, Mr. Kivel said so far he has seen no telltale uptick in banking software purchases.

"While we'd have expected by now for banks to have seriously investigated the buy-versus-build decision and the reintegration costs, many are still just involved in initial assessment," he said.

Mr. Kivel and others said the fact that many banks are ignoring the issue is a growing problem.

The consultants said no "silver bullet" applications had emerged that would make updating for 2000 easy and quick.

And third-party partners may soon reach a point where they have too many clients to allow new projects. "The big issue is, will we all have enough people to do fixes?" said Mr. Goldhammer.

Some tempered the ominous tone taken by a majority of vendors. "This is what bank technical staffs are used to doing - patching and fixing - so it's a larger burden but not a complete departure for them," said M. Arthur Gillis, an independent consultant at Computer Based Solutions Inc. in Dallas.

But in general, the time has come for banks to take seriously the threat posed by the millennium.

"Those people who by the end of this year have not at least inventoried their systems . . . are in danger of missing the window," said Mr. Kivel.
