GC Friday Interview: Joe Turso, VP, SmartStream, on Using a Central Data Utility

Data processing costs have skyrocketed since the financial crisis, and in response, SmartStream and Euroclear have partnered to create a central data utility. The utility allows firms to outsource their data management, sharing the costs with other financial institutions, and once fully implemented, says SmartStream Vice President Joe Turso, it could drive down costs by around 60%.
By Jake Safane

GC: How have data management needs grown since the crisis?

JT: Prior to the financial crisis, consolidating data across a firm’s platforms to facilitate risk management was a nice-to-have, discretionary project. After the financial crisis, it really became a mandatory project, driven by clients. If you look at industry costs, I think they justify that trend. Prior to the crisis, total industry spend on data processing was probably somewhere around $50 billion. That number has now increased to around $125 billion. And I think the driver behind that is regulatory compliance. It’s really becoming mandatory for firms to cross-reference and standardize the data across their businesses in order to meet compliance requirements.

GC: Where are firms at now in terms of using tools like a centralized utility to bring down costs?

JT: I think most firms have more than one master data management platform, and there are still silos within each of the individual businesses. There have been a lot of projects at most of the major institutions to try to consolidate and integrate that data, and I think all of them find it very difficult to do, for multiple reasons. Over time, siloed databases come to speak different languages. I’m not talking about programming languages, but about their data dictionaries, their attribution, their different identifier schemes, their different governance models, if they have a governance model at all. So it’s very difficult and very costly to integrate that data, and I think that’s what all these financial institutions are experiencing right now. A lot of them try to use the latest ETL tool technology to accomplish that, and I think they’re finding those tools have grown over time and become better in terms of their abilities, their speed and how easily they can be configured, but they’re still very extensive implementations.

So that’s where I think reference data utilities come into play. The reference data utility takes the complexity of the standard in-house build and moves it into the utility. For example, the process of normalizing data across different vendors and the process of creating a centralized cross-reference across vendors is done within the utility. We’ve moved the complexity out of those in-house implementations and into the utility, where the cost can be shared across multiple banks and financial institutions, so the cost of integration comes down. Essentially we’re doing all the normalization and all the cross-referencing once, and all the clients can benefit. So when you go to integrate, it’s much easier, because you’re dealing with a standardized set of data; we’ve already done all that cross-referencing within the utility.
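
To illustrate the normalization and cross-referencing Turso describes, here is a minimal, hypothetical Python sketch. The vendor names, field mappings and ISIN-keyed merge are illustrative assumptions, not SmartStream's actual data model.

```python
# Hypothetical sketch of normalizing vendor feeds to a common object model
# and building a central cross-reference of vendor identifiers.
from dataclasses import dataclass

@dataclass
class Security:
    """Common object model: one normalized record per instrument."""
    isin: str
    name: str
    currency: str
    vendor_ids: dict  # vendor name -> that vendor's identifier

# Each vendor feed arrives with its own field names and identifier scheme (assumed here).
VENDOR_FIELD_MAP = {
    "vendor_a": {"id": "SEC_ID", "isin": "ISIN_CD", "name": "LONG_NAME", "ccy": "CRNCY"},
    "vendor_b": {"id": "instrument_key", "isin": "isin", "name": "description", "ccy": "currency"},
}

def normalize(vendor: str, raw: dict) -> Security:
    """Map one raw vendor record onto the common object model."""
    fields = VENDOR_FIELD_MAP[vendor]
    return Security(
        isin=raw[fields["isin"]],
        name=raw[fields["name"]],
        currency=raw[fields["ccy"]],
        vendor_ids={vendor: raw[fields["id"]]},
    )

def cross_reference(records: list[Security]) -> dict[str, Security]:
    """Merge normalized records into one entry per ISIN, accumulating each
    vendor's identifier so clients can translate between identifier schemes."""
    master: dict[str, Security] = {}
    for rec in records:
        if rec.isin in master:
            master[rec.isin].vendor_ids.update(rec.vendor_ids)
        else:
            master[rec.isin] = rec
    return master
```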

GC: What are some of the barriers and concerns firms have with using a utility?

JT: I think it’s a different concept for people to understand. It’s a very unique offering in the marketplace, so I think folks are still trying to get their heads around what the utility is and how it differs from a standard master data management (MDM) platform, and I think that’s really the challenge. People need to think outside the box in terms of how a utility can play into their MDM and governance model. People also need to get over that [concern] of outsourcing to a utility. Most financial institutions have outsourced before, but in ways where they still retain complete control. With the utility model, you’re letting the utility do that normalization and that cross-referencing. So I think people need to take that leap of faith in terms of the utility performing some of those functions for them.

GC: What’s stopping other vendors from creating a competing utility, and what would make firms want to choose one over another?

JT: A lot of vendors would say they have a data utility platform, but most don’t. I don’t really think anyone does aside from the SmartStream offering, and the reason is that you need to be a multi-tenanted platform. Essentially, that means you cannot commingle or share data across vendors and across clients. You need to manage the data centrally, but you have to process the data separately. So you have to think very differently about a utility versus a typical MDM platform. Most MDM platforms are based on creating golden copies and arbitrating data. In a utility environment, you’re not allowed to commingle data, you’re not allowed to arbitrate data, you’re not allowed to compare data. So you have to manage data very differently. We standardize data to a common object model and then we apply a consistent set of metadata and data quality rules to manage that data, but at no time do we ever compare or commingle that data. To be a utility, the architecture has to be very different [from a typical MDM platform], and most platforms out there today have been architected to create golden copies and commingle data.
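
To illustrate the multi-tenant constraint Turso describes (rules managed centrally, with each client's data processed separately and never commingled or compared), the following hypothetical Python sketch uses assumed names and rules; it is not SmartStream's actual architecture.

```python
# Hypothetical illustration: one shared, centrally managed rule set, applied
# to each client's entitled data in isolation. No tenant's records are ever
# merged with or compared against another tenant's records.

SHARED_QUALITY_RULES = [
    lambda rec: bool(rec.get("isin")),                          # identifier must be present
    lambda rec: rec.get("currency") in {"USD", "EUR", "GBP"},   # currency must be recognized
]

def passes_quality(rec: dict) -> bool:
    """Apply the centrally managed rule set to a single record."""
    return all(rule(rec) for rule in SHARED_QUALITY_RULES)

class TenantStore:
    """Per-client store: each tenant only ever holds data it is entitled to."""
    def __init__(self, client_id: str):
        self.client_id = client_id
        self.records: list[dict] = []

def process_tenant(store: TenantStore, entitled_feed: list[dict]) -> None:
    """Run one client's entitled feed through the shared rules in isolation;
    records from other tenants are never read, merged, or compared here."""
    for rec in entitled_feed:
        if passes_quality(rec):
            store.records.append(rec)
```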

GC: How do you anticipate firms will react to this utility over the next few years, and do you have any plans to enhance it?

JT: I do think utilities are the way of the future. From what I’m hearing from the major financial institutions, a lot of them are ready and receptive to this idea. I’m really encouraged by the amount of conversation going on in the industry and by the amount of activity that we’re seeing. We’re seeing people thinking really, really hard about utilities and getting on board with these concepts. I think over the next couple of years, and even going into this year, you’re going to see much more activity in this space. I think the Tier 1 banks are going to lead the way, and once that happens you’ll see the middle tiers as well as the buy side get on board.

With the utility itself, it’s all about enriching the content. We’re constantly expanding our content.
