As a newly designated systemically important financial institution, CIT Group is expected to be able to quickly show regulators it has a handle on things.
That's a tall order when the company has to deal with its legacy systems and data, along with those of OneWest Bank, the institution it bought last year to push CIT over the $50 billion-asset threshold into SIFI territory. As a SIFI, the bank is subject to additional scrutiny and faces tougher stress-testing expectations from regulators.
"Prior to day one" of a merger's closing, "regulators want to know exactly what your plans are, are you going to be able to comply with these new regulations," said BJ Fesq, CIT Group's chief data officer. "Some kick in immediately, and some over time, but you have to show you can comply."
The company believes it has found that swiftness through data virtualization, a technique that lets applications retrieve data from disparate sources without needing to know where the data physically resides. Essentially, the company now has a virtual data layer that unifies data from its many different sources and systems and provides a single source of record.
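Denodo's platform is proprietary, but the core idea can be sketched in a few lines: a virtual layer exposes one logical query interface and delegates to whatever physical backends hold the data, so callers never touch the sources directly. The class and source names below are hypothetical illustrations, not CIT's or Denodo's actual implementation.

```python
# Minimal sketch of a data-virtualization layer (all names hypothetical).
# Applications query logical dataset names; the layer maps each name to
# one or more physical sources and merges the results.

class VirtualDataLayer:
    def __init__(self):
        self._sources = {}   # logical name -> list of (source id, fetch fn)

    def register(self, logical_name, source_id, fetch):
        """Map a logical dataset name to a physical source."""
        self._sources.setdefault(logical_name, []).append((source_id, fetch))

    def query(self, logical_name):
        """Return a unified view across every source holding this dataset."""
        rows = []
        for source_id, fetch in self._sources.get(logical_name, []):
            rows.extend(fetch())   # the caller never sees where rows live
        return rows

# Two disparate "systems" -- stand-ins for, say, a legacy platform and an
# acquired bank's customer database.
layer = VirtualDataLayer()
layer.register("customers", "legacy_core", lambda: [{"id": 1, "name": "Acme"}])
layer.register("customers", "acquired_bank", lambda: [{"id": 2, "name": "Globex"}])

print(layer.query("customers"))  # one logical dataset, two physical sources
```

The point of the pattern is that when a source system is replaced or migrated, only the registration changes; every application querying the logical name is untouched.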
"There was a very manual process" before data virtualization, Fesq explained. "Without good data-lineage capabilities, you have to go and manually discover [a certain data point], go through the source code … it's a huge effort."
While CIT is using the technique to help it deal with the challenges of being a much larger organization, other banks might be drawn to data virtualization because it promises to free up resources currently devoted to wrangling disparate systems so they can be used elsewhere.
"It is, by all early accounts, a more cost-effective way to manage your data," said Terence Roche, a principal at Cornerstone Advisors. "If you can redirect dollars spent on infrastructure, and you can take that money and invest it in delivery systems, that's a win for any bank."
In general, more banks are exploring data virtualization as a way to reconcile various systems and sources of data, according to Denodo, the Palo Alto, Calif., firm that CIT is working with to build the environment.
"Financial institutions have a huge volume of data, and they struggle to get that single source of truth from multiple systems," said Ravi Shankar, Denodo's chief marketing officer. "These systems drive critical business processes, and they struggle to gain that single, universal view of the customer," he added.
For CIT, the move was not totally connected to its OneWest acquisition. In fact, it started the process with Denodo in the spring of 2014, a few months before the $3.4 billion deal to buy OneWest was announced.
"We started looking at data virtualization technology a couple of years ago," Fesq said. "We were looking to simplify our data environment anyway and go after the agility benefits data virtualization provides. Then we found out about the merger halfway down the road, and we knew this would also help us comply."
Denodo helped the company create a data services layer built on data virtualization. The technology powers CIT's new data architecture, which allows it to integrate new data sources easily, improve data quality and gain business-process benefits such as faster time to market.
This "lets us understand when people use data and where they are getting it from, and how they are using it," Fesq explained. "It answers any questions about data lineage."
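Fesq's point about lineage follows naturally from the architecture: if every read passes through one layer, that layer can record which physical source supplied each value and who asked for it, so "where did this number come from?" no longer requires digging through source code. The sketch below illustrates the idea with hypothetical names; it is not Denodo's actual lineage mechanism.

```python
# Sketch of automatic data lineage (all names hypothetical). Because every
# read goes through the virtual layer, the layer can tag each row with its
# physical origin and log which consumer requested which dataset.

class LineageLayer:
    def __init__(self):
        self._sources = {}   # logical name -> list of (source id, fetch fn)
        self.audit_log = []  # (requester, logical name) pairs

    def register(self, logical_name, source_id, fetch):
        self._sources.setdefault(logical_name, []).append((source_id, fetch))

    def query(self, logical_name, requested_by):
        rows = []
        for source_id, fetch in self._sources.get(logical_name, []):
            for row in fetch():
                rows.append(dict(row, _lineage=source_id))  # record origin
        self.audit_log.append((requested_by, logical_name))
        return rows

layer = LineageLayer()
layer.register("balances", "general_ledger", lambda: [{"acct": "A1", "bal": 100}])
rows = layer.query("balances", requested_by="stress_test_report")

print(rows[0]["_lineage"])   # which system the figure came from
print(layer.audit_log)       # who requested which dataset
```

With this in place, answering a regulator's question about a reported figure is a lookup rather than the manual source-code hunt Fesq describes.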
The work on overhauling CIT's data architecture is continuing. Fesq said that by the end of 2017, when the system is fully deployed, most manual searches for data will no longer be necessary, saving the bank millions of dollars per year as it establishes a "single source of truth" for each piece of data. That not only helps with the increased regulatory reporting that comes with being designated a SIFI, but also benefits business operations generally. CIT is not subject to the Federal Reserve's Comprehensive Capital Analysis and Review this year, but it is required to submit a capital plan to regulators and is expected to join the CCAR process in subsequent years.
"It absolutely helps us with regulators, but this is the data environment you should want anyway to run your business better and be more efficient," Fesq said.
Implementing data virtualization probably made the most sense as a way to effectively merge two financial institutions that each grew rapidly, Roche said.
"They are two banks that scaled pretty quickly," he said. "Their data management issues and data complexity are going to be much more complex than with a $10 billion bank."
Further, he added, CIT will probably want to grow even more, and data virtualization makes that process easier.
"You don't see a lot of banks that want to be $50 billion and one dollar" in asset size, he said. "If you go past $50 billion, you want to go a lot past it. This project seems like a refresh for two banks that scaled quickly and probably want to continue to scale quickly."
Roche said larger regionals and big banks are the ones mostly using data virtualization, while smaller banks may take longer.
"The challenge downstream is that they are very heavily dependent on their core vendors, whereas a bank $50, $60 billion and above, they have money to invest in additional technology resources," he said. "Smaller banks may not be early movers; they'll have to see if their providers start to offer this."