Modernizing legacy banking systems



Legacy core systems and data ecosystems are blockers to the transformation agenda at many banks. Ab Initio Software runs on platforms from mainframe to cloud. We natively process data formats from bit-packed VSAM and IMS to MongoDB and JSON, and orchestrations from batch to API. We will walk through case studies where we have helped banks modernize and transform: tapping into legacy data assets, accelerating time to market for machine learning and next-best-action decisions, managing instant payments, and rationalizing and simplifying their data processing ecosystems.

Case studies range from decommissioning mainframes, to wrapping legacy payment systems to expose APIs, to migrating data and systems to new architectures, to deploying DevOps automation for cloud data pipelines. We will describe the business drivers, the technology mission, and the business outcomes.

Transcript:

Speaker 1 (00:06):

I've been a bank officer at JP Morgan, and earlier in my career I worked at Citi and other financial institutions. Today we're going to talk about lessons learned and best practices in core systems modernization, and we're going to review a couple of case studies. I just want to level set with everybody first, because this is a broad-spectrum term. What does it mean? It is any and all of the following. First, fundamentally replacing core systems: taking core banking processing systems like a deposit system or a lending system off of a mainframe or legacy infrastructure and modernizing them. Often these days that means migrating from a monolithic application structure to an API-first strategy in which banking products and services can be recombined for new customer experiences. Modernization can also mean we're keeping the mainframe, at least during a transitional period, and we want to reduce its cost. That's a cost optimization strategy, not necessarily one that creates business agility. And there's a third general definition, which is leveraging the mainframe's data and processing and integrating it with the rest of the bank, with the goal of increasing business agility. Ab Initio Software has experience in all of these different strategies and approaches, and we're going to talk about this.

So why modernize core systems? Again, just to level set on what you probably already know: this is a longstanding issue. The mean time to repair defects in systems that are decades old, 20- or 30-year-old systems, is increasing. The amount of business functionality in each quarterly release tends to decrease, and there's increased operational risk as the original workforce that designed, maintained, and supported these systems is either on the verge of retirement or already retired. But it's the last factor that's the real accelerator here: the bank is losing business due to lack of business agility. The first three factors were often not sufficient to drive a case for change, but the latter factor in the digital economy is killing many banks, and maybe your bank, because quarterly release cycles don't keep up with the pace of change required in the digital economy. By the way, feel free to interrupt or ask questions or comment at any time. I'm from New York, you can't faze me.

One of the challenges with core systems modernization: if it was easy, it would already be done. Core systems are often the heart and lungs of a bank, and they represent incredible business and technical risk in the execution. These are systems that are, as I said before, decades old, tightly coupled, undocumented. And if you're a large enough bank, you can't do this without approval from your regulators, because a change in core systems can constitute a risk to the banking system as a whole. So there has to be a way of executing these modernizations without creating great business risk. What we're seeing, though, is multi-year efforts. Somebody at this conference said quite recently that they expect it's going to be a 10-year journey for them. On a 10-year journey of getting out of core banking systems, you risk stakeholder fatigue.
And in a couple of cases at banks I could mention, this has resulted in a half-migration. You've got some of the mainframe stuck in place, you have some of the new architecture in place, and you've only introduced increased complexity, because you haven't executed the whole of the transformation program.

(04:49)

So how do we avoid business stakeholder fatigue, where the business pulls the plug on a 10-year program in year two, three, or four? The key is to execute predictably, with regular business delivery, to show tangible improvement against your original business case, so that people can see through systematic delivery that this multi-year program is working, and maybe it doesn't need to take 10 years to deliver business benefits. There are four ways in which Ab Initio has experience providing business value during such migration efforts, and we're going to go into each of these.

The first is to bridge the legacy to the modern infrastructure. This can't be a big bang, so you're going to have to do a phased migration, and you're going to need to keep positions, balances, and accounting reconciled and consistent across legacy infrastructure and your new modern infrastructure as you evolve and migrate systematically. Secondly, there are ways of offloading processing for cost savings; this is a way of showing tangible business outcomes as you go. Third, there are ways of increasing business agility, for example by implementing straight-through processing. Many of these systems are batch in nature, and there are ways of modernizing logic and integrating: at one bank, we're integrating COBOL batch services with Java API services and Ab Initio API services to create more straight-through processing. So there are solutions there. And fourth, as I said before, you can incrementally start to leverage the mainframe more. This might be contrary to your larger IT strategy, but it might be necessary for the business to get every ounce of juice you can out of the mainframe even as you're getting off the mainframe. Sorry if I'm using the terms mainframe and legacy systems interchangeably or arbitrarily, but that's the way many people do.

So, one way of unlocking data for new applications, and this is a design pattern that comes from one of our customers' architecture: they pull together all of the different data resources they have into a data infrastructure. You're replicating the contents of the legacy system in an operational data store, a data lake, or a data mart; these are different types of data architecture supporting different use cases. Key to this is Ab Initio's technology to take unlabeled physical data assets and discover, with machine learning and other techniques, their business content. Say we have a field P47 in an IMS data segment and you want to know what that really means: we can discover this by a couple of different methods and strategies that we have. We find this is key to leveraging mainframe data and unlocking data on the mainframe: getting business transparency into data contents and registering them in a data catalog service. Sharing data resources helps you decommission incrementally as you go, because you understand what in the legacy world can be shut off as you're turning on new capabilities.
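Ab Initio's discovery tooling is proprietary, but the general technique just described, profiling an unlabeled field's values against known business domains before registering a proposed label in a catalog, can be sketched in a few lines. This is a minimal illustration under assumptions: the domain patterns, the field name P47, and the sample values are all hypothetical, and a real tool would also use trained classifiers, overlap with reference data, and name similarity.

```python
import re
from collections import Counter

# Hypothetical reference patterns for common banking domains.
DOMAIN_PATTERNS = {
    "us_routing_number": re.compile(r"^\d{9}$"),
    "currency_amount":   re.compile(r"^-?\d+\.\d{2}$"),
    "date_yyyymmdd":     re.compile(r"^(19|20)\d{6}$"),
}

def profile_field(values, min_match_rate=0.95):
    """Guess the business domain of an unlabeled field (e.g. 'P47'
    from an IMS segment) by testing its values against each pattern."""
    candidates = Counter()
    for v in values:
        for domain, pattern in DOMAIN_PATTERNS.items():
            if pattern.match(v.strip()):
                candidates[domain] += 1
    for domain, hits in candidates.most_common():
        if hits / len(values) >= min_match_rate:
            return domain   # confident single label for the catalog
    return None             # leave for a human data steward to resolve

# Sample values unloaded from the legacy segment:
p47_sample = ["20240115", "20231207", "20240302"]
print(profile_field(p47_sample))  # -> date_yyyymmdd
```

A confident label would then be registered against the physical field in the data catalog service; anything ambiguous goes to a steward queue rather than being guessed.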

(08:33)

So the next is this bridging concept that I mentioned before: migrating a portion of a system at a time. Key to such a capability, in Ab Initio's experience, is the fact that we can operate as a middleware layer that can run on the mainframe, or run in the cloud, or whatever your new target architecture is. We've run on many different platforms; visit us at booth 400 for a deeper discussion on that. We can process all these legacy data structures, and we can process the modern data structures we're seeing across the banking landscape. These are just different data formats to Ab Initio; we can process them all. And in many cases we're registering and exposing the data catalog sets I mentioned before as API services for other applications to consume, and you can then migrate in a controlled fashion.

Decommissioning, by the way, does not happen accidentally. It has to be thought about throughout your execution. I worked for a CIO at JP Morgan many years ago, and I came to him with a business case, and he said, well, that's great, you're going to build another regulatory data mart. Tell me which three data marts you're going to take out. I'll give you enough money to start the program, and you have to fund your business by decommissioning the others. That's a good discipline. Whether your CIO or CFO is that tough or not, you have to be prepared to show business value, and if you've just doubled the cost, you haven't created business value. So think about decommissioning incrementally as you go, which means you need transparency into data contents to understand what data and processing on the original mainframe or legacy host you can turn off as you're turning on new capabilities in the new architecture.

As you go on these programs, and again to avoid stakeholder fatigue, you want to show that you're increasing the use of data, and you want to show that you can reduce operating costs and defer upgrades like adding new LPARs, which are often very expensive. There is often low-hanging fruit: batch processing that happens on the mainframe that you could quickly convert to Ab Initio. We're a general-purpose computing environment; anything that COBOL can do, Ab Initio can do. We translate COBOL to Ab Initio, so we can help you migrate that COBOL onto the cloud, and we can emulate and replicate IMS structures, or whatever your mainframe data structures are, which you won't necessarily find in a cloud model. So there are solutions we have for these things, to help people migrate application software off of the mainframe. Batch processing like billing and statements, things that are not core transaction processing, that have accumulated on the mainframe over time, we can migrate systematically and incrementally, and you can start to show cost savings as you go. We have a whole testing program for this, for comparison.
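The comparison testing just mentioned is Ab Initio's own program; as a rough illustration of the underlying idea, a parity check between a legacy batch output and the migrated job's output might look like the sketch below. The file names, key field, and balance column are hypothetical, and real reconciliation would cover many more fields and record types.

```python
import csv
from decimal import Decimal

def reconcile(legacy_path, migrated_path, key_field="account_id"):
    """Compare a legacy batch output with the migrated job's output,
    keyed on an account identifier. Flags missing or extra records and
    balance differences beyond a one-cent rounding tolerance."""
    def load(path):
        with open(path, newline="") as f:
            return {row[key_field]: row for row in csv.DictReader(f)}

    legacy, migrated = load(legacy_path), load(migrated_path)
    breaks = []
    for key, old in legacy.items():
        new = migrated.get(key)
        if new is None:
            breaks.append((key, "missing in migrated output"))
        elif abs(Decimal(old["balance"]) - Decimal(new["balance"])) > Decimal("0.01"):
            breaks.append((key, f"balance {old['balance']} -> {new['balance']}"))
    for key in migrated.keys() - legacy.keys():
        breaks.append((key, "unexpected extra record"))
    return breaks

# A clean run over a full posting cycle is the exit gate for
# turning off the corresponding legacy job.
for key, reason in reconcile("legacy_eod.csv", "migrated_eod.csv"):
    print(key, reason)
```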
Did somebody have a question? No, I'm hearing voices in my head. Alright. Another big win that we find, and several banks have done this with us: they look at things like their deposit system and analyze what's using a lot of MIPS. There has been an enormous uptick in MIPS consumption since the introduction of smartphones. Banking apps on smartphones, looking at what's my balance, did that check clear, did my wife make that payment on the mortgage: those balance inquiries often consume a lot of MIPS cost. And if you analyze your own hosts and systems, you may find that to be the case. What we have done in those cases is implement change data capture facilities; we have change data capture for Db2 on the mainframe host, and we've done this in some circumstances with IMS as well. So you can move that data off the mainframe, and here Ab Initio really doesn't care what your target environment is. For some people it's a data store on the cloud, but whatever that data store is, we publish the data there, and the smartphone then queries against, let's say, a cloud system or an on-prem system instead of consuming MIPS cost. Again, this is a strategy for reducing cost to show business value while you're in the course of a migration. You might not ever want to migrate off the mainframe system, but you still might want to modernize and optimize cost to improve your efficiency ratio.

Another capability that we have is the ability to do straight-through processing, which I alluded to before: combining real-time processing with batch processing. Memo posting on a Hogan system, for example, is done as a shadow accounting system during the day, and as many of you probably know, at night the Hogan deposit system actually posts and clears all the debits and credits that happened during the day. We have looked at this for a number of different customers, specifically in the Hogan banking system case, and we've seen the surrounding COBOL systems that are processing in a batch way, supporting memo posting and different activities, and integrating with, let's say, legacy Java architecture. We can translate that COBOL, render it as an API service in Ab Initio, then integrate that API service with the original Java you might have had for intraday transactions, and combine and orchestrate them within an API solution. This is another way of creating a certain amount of business agility with your existing mainframe, allowing you to modernize. A real advantage here is that exposing API services that may be using mainframe data and processing insulates other applications from the fact that you're going to be changing the mainframe over time. It's an insulating and abstraction layer that enables incremental migration.
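A rough sketch of the change-data-capture offload described above: change events captured from the mainframe Db2 log are applied to an off-host replica that balance inquiries read instead of the mainframe. The event envelope, table shape, and the SQLite stand-in for the target data store are assumptions for illustration, not Ab Initio's actual CDC interface.

```python
import json
import sqlite3  # stand-in for whatever cloud or on-prem store you target

conn = sqlite3.connect("balances_replica.db")
conn.execute("""CREATE TABLE IF NOT EXISTS balances (
    account_id TEXT PRIMARY KEY, balance TEXT, as_of TEXT)""")

def apply_change(event):
    """Apply one CDC event (insert/update/delete) captured from the
    mainframe log to the off-host replica. The event shape here is
    hypothetical; real CDC feeds define their own envelopes."""
    op, row = event["op"], event["row"]
    if op in ("insert", "update"):
        conn.execute(
            "INSERT INTO balances VALUES (?, ?, ?) ON CONFLICT(account_id) "
            "DO UPDATE SET balance = excluded.balance, as_of = excluded.as_of",
            (row["account_id"], row["balance"], row["as_of"]))
    elif op == "delete":
        conn.execute("DELETE FROM balances WHERE account_id = ?",
                     (row["account_id"],))
    conn.commit()

def get_balance(account_id):
    """The smartphone app's balance inquiry reads the replica,
    consuming no mainframe MIPS."""
    cur = conn.execute(
        "SELECT balance, as_of FROM balances WHERE account_id = ?",
        (account_id,))
    return cur.fetchone()

apply_change(json.loads('{"op": "update", "row": {"account_id": "1001", '
                        '"balance": "2543.17", "as_of": "2024-01-15T14:02:11Z"}}'))
print(get_balance("1001"))  # ('2543.17', '2024-01-15T14:02:11Z')
```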

(15:30)

And in general terms, what we're seeing is that this ability to implement APIs will also expose mainframe logic for new products and services. You might have a CICS transaction that helps you configure some aspect of the customer experience: leverage that with an API call to CICS to run the transaction, or a remote processing call by whatever means necessary, even remote job entry. These are all different interfacing paradigms that we have experience with.

So thank you for your attention. I want you to take away a couple of messages here; if I fall off the stage and die, I want you to walk away with a couple of thoughts. Firstly, this big bang thing is never going to work. You need incremental, controlled migration. This is crucial, and you need to manage your stakeholders, and your stakeholders include the Federal Reserve. Provide predictable delivery of business value for your internal stakeholders: show progress, reduction of cost, and increase of business agility as you go. This is the only way to avoid stakeholder fatigue. And as I mentioned, if you have stakeholder fatigue that results in the program being abandoned at year two or year three, that often creates a worse mess than what you had in your original legacy world. So it's really critical that you have predictable delivery, that you understand what you're migrating, migrating a portion of a portfolio, a portion of your customer base at a time as you're migrating to new systems, and that you manage risk. There's a palette of approaches that I illustrated here for doing different things: for effecting total migration, for effecting reduction in cost, for effecting increased business agility, as you do these migrations, modernizations, and upgrades. You can unlock the mainframe's data and processing, you can offload low-hanging fruit in legacy processing, and you can migrate core processes and functions a block at a time, a block of functionality, a block of the customer portfolio at a time. You can combine these and minimize your risks.

To further your interest: Ab Initio never sells. Take nothing I said at face value. Everything that we say, we will back up with a technical proof of concept. We will show you these methods. If you have a business challenge in upgrading your core systems, because you need reduction in cost, because you need business agility to keep up with the changing needs of your retail or commercial customers, we can help you. If you have a POC that proves a business case, we will invest in it at no charge. We never sell without proving our value first. We're not a transaction-oriented company; we've been in business for 27 years. 50% of the world's globally systemically important banks use Ab Initio. 50% of the domestically systemically important banks in the US, Australia, Canada, Japan, and countries around the world use Ab Initio software and have come to rely on us for mission-critical applications. We will prove the value before we undertake mission-critical transformations. We've done whole system replacements, and we've helped businesses transform. There was one customer that was a credit bureau; their data architecture was encapsulated in a mainframe environment. We entirely replaced that because their business model had changed: they didn't want to be just a credit bureau anymore, they wanted to be a full-service information broker to banks.
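The API insulation layer described above, wrapping CICS transactions and COBOL services so callers never see where a function executes, can be sketched as a routing facade. This is a minimal illustration under assumed names: `call_cics`, `call_new_service`, the transaction IDs, and the function names are hypothetical stand-ins, not a real gateway API.

```python
# Consumers call one stable entry point; routing flips per function
# (or per customer block) from the legacy host to the new service as
# the migration proceeds, so callers never see the cutover.

MIGRATED_FUNCTIONS = {"balance_inquiry"}  # grows as blocks cut over
LEGACY_TRANSACTIONS = {"memo_post": "MPST", "stop_payment": "STPP"}

def call_cics(transaction_id, payload):
    # Hypothetical stand-in for a real CICS gateway invocation.
    return {"route": "legacy", "tran": transaction_id, **payload}

def call_new_service(function, payload):
    # Hypothetical stand-in for the modern API service.
    return {"route": "modern", "function": function, **payload}

def invoke(function, payload):
    """Stable API facade: insulates consumers from where a banking
    function actually executes during incremental migration."""
    if function in MIGRATED_FUNCTIONS:
        return call_new_service(function, payload)
    return call_cics(LEGACY_TRANSACTIONS[function], payload)

print(invoke("balance_inquiry", {"account_id": "1001"}))  # routed modern
print(invoke("memo_post", {"account_id": "1001"}))        # routed legacy
```

Because the facade's contract never changes, moving a function out of `LEGACY_TRANSACTIONS` and into `MIGRATED_FUNCTIONS` is invisible to every consuming application.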
So we can do business transformation aligned with technical transformation; we're a platform for driving change. Visit us at booth 400, or contact us: send us an email at sales@abinitio.com. We have a web presence, there's a website, we're at the conference, I'm here. Let me open it up right now for any questions, and I'll improvise answers. Again, we're here to help, so feel free to reach out to us. If it sounds too good to be true, it's not; this is very difficult, very risky stuff. We understand this is as sensitive and serious as heart and lung surgery, so we approach it with grim earnestness and confidence in our engineering culture that we can help you. Yes.

Audience Member 1 (20:40):

How long do these programs take?

Speaker 1 (20:44):

So, many stars need to align for a rapid transformation. Let me repeat the question: the question was essentially, how long is a piece of string, how long do these programs take? In the case of the credit bureau that I mentioned before, which had a critical imperative to change, the whole transformation was done in under two years. At American Banker last year there was a woman speaking, Kristin, I can't remember her last name, from Zions Bank, and she talked about theirs being an 11-year transformation. An 11-year transformation requires a certain amount of fortitude, and I don't know how many CEOs or CIOs are going to have the staying power to believe they're going to have their job in 11 years. So we think it can be done in a couple of years, much faster. We have many different ways to work with you, but these are large programs, and we're also only a portion of the solution here; I don't want to mislead anybody, there's no magic in life. Understanding your current data we can help with, but your processing dependencies, your systems flow, and your data dependencies across systems and architecture are often embedded in JCL, job control language, or in logic embedded within applications. These things were created over dozens of years, sometimes without a lot of good architectural guidance, so it's a little messy sometimes. Many of the major SIs have experience there. I worked at IBM and at Price Waterhouse before this, and I will tell you, you get the best value out of an SI the more demanding you are. You have to look at them and say: we want a structured analysis of our program, we want it in this many months, don't drag this out. And we need to functionally decompose our processing system so we can execute incrementally, and then we can do data analysis to give you an idea of the value at risk in the portfolio as you move a piece at a time.
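Those JCL-embedded dependencies are exactly what a structured analysis has to surface. A toy sketch of the idea: scan each job's JCL for dataset references and link writers to readers to build a job dependency graph. The job names, JCL fragments, and regex are hypothetical, and a real analysis would also resolve PROCs, symbolic parameters, GDGs, and conditional steps.

```python
import re
from collections import defaultdict

# Hypothetical JCL fragments keyed by job name.
JOBS = {
    "DAILYPST": ("//POST  EXEC PGM=POSTING\n"
                 "//IN    DD DSN=PROD.TXNS.DAILY,DISP=SHR\n"
                 "//OUT   DD DSN=PROD.BAL.MASTER,DISP=(NEW,CATLG)"),
    "STMTGEN":  ("//STMT  EXEC PGM=STMT\n"
                 "//IN    DD DSN=PROD.BAL.MASTER,DISP=SHR"),
}

DSN = re.compile(r"DSN=([A-Z0-9.]+),DISP=\(?([A-Z]+)")

def dependency_edges(jobs):
    """Link jobs that write a dataset to the jobs that read it."""
    writers, readers = defaultdict(set), defaultdict(set)
    for job, jcl in jobs.items():
        for dsn, disp in DSN.findall(jcl):
            # DISP=SHR is a read; NEW/OLD/MOD are treated as writes here.
            (readers if disp == "SHR" else writers)[dsn].add(job)
    return [(w, r, dsn)
            for dsn, ws in writers.items()
            for w in ws for r in readers.get(dsn, ())]

for upstream, downstream, dataset in dependency_edges(JOBS):
    print(f"{upstream} -> {downstream} via {dataset}")
# DAILYPST -> STMTGEN via PROD.BAL.MASTER
```

A graph like this is what lets you carve the portfolio into blocks that can be migrated, and decommissioned, independently.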

Audience Member 2 (23:05):

Hi, I'll give another example. We had one customer (inaudible), and in this kind of conversion, most of the effort is around testing, to make sure that you're getting the same results. So it becomes more and more a testing project, as well as continuous integration between your legacy processing platform and the new processing platform.

Speaker 1 (24:12):

Yeah. I mean, because this is a public forum I can't go into names, but if we were in a private conversation I could tell you some of the customer names; these are big Fortune 500 companies that we've helped in this regard. Global capitalism has depended on us executing some of these projects, so we have a lot of experience in this. Any other questions?

Audience Member 3 (24:42):

How small of an organization is too small for you to work with?

Speaker 1 (25:00):

So, in general terms, Ab Initio's marketplace tends to be the larger, more complex financial institutions. We have done work with financial institutions in the $50 billion asset range, but below that we find it's not really a fit. We're not claiming to be a fit for everybody. If you have an imperative, I'm not saying we would preclude the conversation, but I'm giving you a general sense of our marketplace. There may be other ways of doing what we're doing. And when we get down to the community bank and credit union space, we find that with the banking systems that are out there, whether you go from Fiserv to FIS to Jack Henry, back and forth amongst the three of them, the switching costs are high and the business benefit valuation isn't that great. So it's a complicated world below $50 billion in assets, and it's not our space. All right, if everybody's happy, I might even go eat, myself. And again, booth 400 on the floor.