Updating Core Banking Systems: Necessary or Disruptive, Expensive Nuisance?

The heart of the contemporary bank is the core data center, and a recent study from Dale Vecchio at Gartner argues that by 2010, more than a third of all application projects will be driven by the need to deal with technology obsolescence, not to mention a retiring IT workforce that knows how to operate those technologies.

The way forward is fraught, as Microsoft's consumer rollout of Vista neatly showed: suddenly, systems and hardware that had worked fine no longer worked, and enterprises clamored to go back to the old, i.e. working, software. A bank updating mission-critical applications can't afford that kind of expensive and chaotic experience. So there's a case for simply letting things keep working; after all, Y2K didn't turn out to be such a big deal. But all things come to an end, and Vecchio says the planning just isn't there now. "Most CIOs are struggling to cope with a set of portfolios in which an overwhelming percentage of the artifacts need to be retired and replaced within a comparatively short period of time between 2008 and 2015," the research VP says. He argues that firms must make strategic planning for modernization a core goal and apply this capability immediately.

This sounds disruptive. Arlene Yetnikoff, a DePaul University computer-science expert and former Andersen consultant, says staying the course isn't so bad. "It probably does take quite a bit of 'staying the same' pain before changes are made," she says. "But hopefully we've learned some lessons along the way. One area we always have to look at now when bringing in something new is ... What will it take to move off this new system when we need to?"
