Morgan Stanley focuses on data quality to strengthen AI

Jeff McMillan, chief analytics and data officer at Morgan Stanley, has long worried about the risks of relying on data alone: if the data fed into an institution's systems is inaccurate or out of date, those systems will give customers wrong advice.

And at a firm like Morgan Stanley, that just isn't an option.

As a result, Morgan Stanley has been overhauling its approach to data. Chief among its goals is improving data quality in core business processing.

“The acceleration of data volume and the opportunity this data presents for efficiency and product innovation is expanding dramatically,” said Gerard Hester, head of the bank’s data center of excellence. “We want to be sure we are ahead of the game.”

The data center of excellence was established in 2018. Hester describes it as a hub with spokes out to all parts of the organization, including equities, fixed income, research, banking, investment management, wealth management, legal, compliance, risk, finance and operations. Each division has its own data requirements.

“Being able to pull all this data together across the firm we think will help Morgan Stanley’s franchise internally as well as the product we can offer to our clients,” Hester said.

Gerard Hester, head of Morgan Stanley’s data center of excellence, and Liezel McCord, a managing director of the data center of excellence

The firm hopes that improved data quality will let the bank build higher-quality artificial intelligence and machine learning tools to deliver insights and guide business decisions. One product expected to benefit is Next Best Action, a system the bank developed for its financial advisers.

Next Best Action uses machine learning and predictive analytics to analyze research reports and market data, identify investment possibilities, and match them to individual clients’ preferences. Financial advisers can choose to use Next Best Action’s suggestions or not.
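The bank has not published the internals of Next Best Action, but the general pattern of scoring candidate ideas against a client's stated preferences can be sketched in a few lines of Python. Every name, weight and threshold below is a hypothetical stand-in for illustration, not the bank's model:

    from dataclasses import dataclass

    @dataclass
    class Idea:
        name: str
        sector: str
        risk: int  # 1 (conservative) to 5 (speculative)

    def score(idea: Idea, client: dict) -> float:
        # Toy heuristic: reward sector fit, penalize distance from risk tolerance.
        sector_fit = 1.0 if idea.sector in client["sectors"] else 0.0
        risk_fit = 1.0 - abs(idea.risk - client["risk_tolerance"]) / 4
        return 0.6 * sector_fit + 0.4 * risk_fit

    client = {"risk_tolerance": 2, "sectors": {"tech", "health care"}}
    ideas = [Idea("research idea A", "tech", 2), Idea("research idea B", "energy", 5)]

    # Rank the ideas; the adviser remains free to take or ignore each suggestion.
    for idea in sorted(ideas, key=lambda i: score(i, client), reverse=True):
        print(f"{idea.name}: {score(idea, client):.2f}")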

Another tool that could benefit from better data is an internal virtual assistant called Ask Research. Ask Research provides quick answers to routine requests such as “What’s Google’s earnings per share?” or “Send me your latest model for Google.” The technology is currently being tested in several departments, including wealth management.
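In spirit, an assistant like this routes a recognized question pattern to a data lookup. A minimal sketch, assuming a hypothetical eps_by_company table (the real system's data sources and matching logic are not public):

    import re

    # Placeholder table; a real assistant would query a research database.
    eps_by_company = {"google": "see latest research note"}

    def answer(question: str) -> str:
        # Route one recognized intent: "What's <company>'s earnings per share?"
        m = re.search(r"(?i)what[’']?s (\w+)[’']?s earnings per share", question)
        if m:
            company = m.group(1).lower()
            value = eps_by_company.get(company)
            return f"{company} EPS: {value}" if value else "No data on file."
        return "I can't answer that yet."

    print(answer("What's Google's earnings per share?"))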

New data strategy

But better data quality is just one of the goals of the revamp. Another is to have tighter control and oversight over where and how data is being used, and to ensure the right data is being used to deliver new products to clients.

To make this happen, the bank recently created a new data strategy with three pillars, according to Hester. The first is working with each business area to understand its data issues and begin to address them.

“We have made significant progress in the last nine months working with a number of our businesses, specifically our equities business,” Hester said.

The second pillar is tools and innovation that improve data access and security. The third pillar is an identity framework.

At the end of February, the bank hired Liezel McCord to oversee data policy within the new strategy. Until recently, McCord was an external consultant helping Morgan Stanley with its Brexit strategy. One of McCord’s responsibilities will be to improve data ownership — to hold data owners accountable when the data they create is wrong and to give them credit when it’s right.

“It’s incredibly important that we have clear ownership of the data,” Hester said. “Imagine you’re joining lots of pieces of data. If the quality isn’t high for one of those sources of data, that could undermine the work you’re trying to do.”
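Hester's point is easy to demonstrate. In the sketch below, written with pandas and entirely invented table names, a join quietly loses or distorts rows when one source carries bad keys, so a quality gate flags the problem back to the data owner before the join runs:

    import pandas as pd

    accounts = pd.DataFrame({"client_id": [1, 2, None], "region": ["US", "EU", "US"]})
    trades = pd.DataFrame({"client_id": [1, 2, 2], "notional": [100.0, 250.0, 75.0]})

    def quality_gate(df: pd.DataFrame, key: str) -> pd.DataFrame:
        # One low-quality source can undermine the whole join, so check it first.
        bad = df[df[key].isna()]
        if not bad.empty:
            print(f"Routing {len(bad)} rows with missing '{key}' to the data owner")
        return df.dropna(subset=[key])

    joined = quality_gate(accounts, "client_id").merge(trades, on="client_id")
    print(joined)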

Data owners will be held accountable for the accuracy, security and quality of the data they contribute, and for making sure that any issues are addressed.

Trend of data quality projects

Arindam Choudhury, the banking and capital markets leader at Capgemini, said many banks are refocusing on data as it gets distributed in new applications.

Some are driven by regulatory concerns, he said. For example, BCBS 239, the Basel Committee on Banking Supervision’s principles for effective risk data aggregation and risk reporting, is pushing some institutions to make data management changes.

“In the first go-round, people complied with it, but with point-to-point interfaces and applications, which was not very cost-effective,” Choudhury said. “So now people are looking at moving to the cloud or a data lake; they’re looking at a more rationalized and more cost-effective way of implementing those principles.”
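The cost argument behind that quote is simple arithmetic: point-to-point feeds grow with the square of the number of systems, while a hub such as a data lake grows linearly. A back-of-the-envelope sketch in Python, with a hypothetical system count:

    # Hypothetical count of applications that need to share data.
    n = 20
    point_to_point = n * (n - 1) // 2  # one feed per pair of systems
    hub_and_spoke = n                  # one feed per system into the lake
    print(point_to_point, hub_and_spoke)  # 190 vs. 20 interfaces to maintain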

Another trend pushing banks to get their data house in order is competition from fintechs.

“One challenge that almost every financial services organization has today is they’re being disintermediated by a lot of the fintechs, so they’re looking at assets that can be used to either partner with these fintechs or protect or even grow their business,” Choudhury said. “So they’re taking a closer look at the data access they have. Organizations are starting to look at data as a strategic asset and try to find ways to monetize it.”

A third driver is the desire for better analytics and reports.

"There’s a strong trend toward centralizing and figuring out, where does this data come from, what is the provenance of this data, who touched it, what kinds of rules did we apply to it?” Choudhury said. That, he said, could lead to explainable, valid and trustworthy AI.

Editor at Large Penny Crosman welcomes feedback at penny.crosman@sourcemedia.com.
