Data: The blueprint for success in the digital age

Global Custodian recently caught up with Rimes’ head of data products, Sam Barber, who discussed how a new age of digital services can help securities services providers overcome some of the complex challenges around data management.
By Rimes

What are the biggest challenges facing the asset servicing community concerning benchmark data management? 

In today’s competitive asset servicing environment, all firms acknowledge that the need for accurate, timely and accessible benchmark data is paramount. It stems from the industry’s non-negotiable goal of having a validated, centralised and cost-efficient view of its benchmarks. If you think about the position you want to be in as a user of a centralised benchmark system, you need to be able to easily access the data you are authorised for, link it to the appropriate end user, apply consistent identifiers and understand which modules you require to satisfy your clients’ requirements most efficiently. Given the complexity of benchmarks, inherent skill gaps in managing them and constantly evolving requirements, this is often seen as a utopia, but solutions to these problems do exist.

It’s important to remember that benchmark data management is broad in application and has unique challenges. With user expectations for accurate data at an all-time high, firms also have to contend with multiple providers delivering disparate data sets that can be very difficult to validate, consume, combine and store. Additionally, complying with onerous data license terms, identifying the most suitable primary and alternative data sets, and managing the intricate metadata that comes with them is a daunting task. These challenges amplify the need for modern, flexible technology solutions that simplify data management, enhance operational efficiency and empower asset servicers to navigate the intricate landscape of global investment services with greater agility and confidence.

Simplifying the delivery of validated benchmark data in a clean, usable format can significantly ease the burden on asset servicers, enabling them to better support their clients by providing authorised data that is immediately usable. On licenses, one approach involves creating transparency around what data is accessible throughout the distribution chain, so data can be tailored to each user’s specific needs and the inbuilt complexities demystified. In our experience, establishing a consistent, unified, verified view of benchmarks across multiple sources revolutionises data integration practices. This approach facilitates more efficient data use, enables easier identification of suitable data sets and reduces the risk of costly mistakes. It becomes especially powerful when combined with a comparison tool that allows the quick identification of compatible indexes for specific mandates.
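
To make the idea of a unified view and index comparison concrete, here is a minimal sketch, assuming two invented vendor feed formats, of how constituents from different providers might be normalised to a common ISIN-keyed schema and then compared for overlap. The field names, functions and figures are illustrative only and do not reflect Rimes’ actual tooling or any real vendor’s format.

```python
# Illustrative only: two hypothetical vendor formats normalised into one
# ISIN-keyed schema, then a simple overlap measure between two indexes.

def normalise_vendor_a(rows):
    """Vendor A is assumed to publish constituents as {'Isin': ..., 'Wgt': ...}."""
    return {r["Isin"]: float(r["Wgt"]) for r in rows}

def normalise_vendor_b(rows):
    """Vendor B is assumed to publish {'identifier': ..., 'weight_pct': ...}."""
    return {r["identifier"]: float(r["weight_pct"]) / 100.0 for r in rows}

def constituent_overlap(index_a, index_b):
    """Share of index_a's weight held in names that also appear in index_b."""
    common = set(index_a) & set(index_b)
    return sum(index_a[isin] for isin in common)

# Invented sample data for illustration.
index_a = normalise_vendor_a([{"Isin": "US0378331005", "Wgt": "0.6"},
                              {"Isin": "US5949181045", "Wgt": "0.4"}])
index_b = normalise_vendor_b([{"identifier": "US0378331005", "weight_pct": "55"},
                              {"identifier": "US02079K3059", "weight_pct": "45"}])

print(f"Overlap by weight: {constituent_overlap(index_a, index_b):.0%}")  # 60%
```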

Tackling the complexities of benchmark data management head-on not only improves operational efficiency, it transforms daunting data management challenges into strategic assets, paving the way for more informed, efficient and successful investment strategies.

How is the increased demand for data analytics reshaping the securities services space? 

In 2004, then Hewlett-Packard CEO Carly Fiorina commented, “… the goal is to turn data into information and information into insight.” That remains valid today, particularly in the securities services space, where we are seeing increased demand for data analytics, underpinned by an intense desire to reshape the business by transforming legacy processes and injecting modern technology solutions. For example, an emerging requirement is the immediate delivery of actionable data, supported by generative AI, LLMs and related technology breakthroughs, to enable ‘data at your fingertips’.

However, implementing a ‘data at your fingertips’ environment is complex, especially regarding governance and data integrity. The main challenge is ensuring governance and operational stability without introducing latency or blocking projects. We are all about helping firms enhance their operations while avoiding a sprawl of uncontrolled processes. Maintaining control requires experienced teams dedicated to ensuring the data is in good shape and that data storage is flexible, easy to use and fit for purpose across multiple use cases. In our experience, outsourcing allows firms to accelerate projects or improve productivity by dedicating their own resources to focus areas.

In my opinion, outsourcing is an area where generative AI is key. We have found that deploying AI is far more straightforward when the underlying data is meticulously structured and understandable. This way, users avoid spending time fixing underlying data problems and can embed technical IP into their tooling. The approach is compelling because it liberates the workforce from mundane tasks such as scouring data dictionaries and methodology documents, and it makes introducing new functionality easier and substantially more efficient.

The industry continues to grapple with data fragmentation, data quality and growing data volumes. How do Rimes’ solutions help to tackle these issues?

Data fragmentation, poor data quality and growing volumes are the industry’s scourge, resulting in data landscapes that are hard to manage and expensive to maintain. This is further accentuated by changes in regulation: a recent Carne Group article noted that 99% of surveyed fund managers said it will become much harder to navigate regulatory complexities. Unfortunately, many firms still take a quick-fix approach that treats the problem as binary, meaning all data sets are handled either entirely separately or all in the same way. This has proven not only catastrophic for day-to-day operations but also unable to give the required flexibility for future changes – forced or otherwise.

We take a best-of-both-worlds approach to the sourcing, ingestion and normalisation of data, providing a client-ready view with the ability to pick any data sets from the primary source. Similarly, the data can be validated using both Rimes IP and client IP, ensuring it meets end users’ precise requirements. For example, front-office users can receive as-published data, while users who require the highest degree of precision get increased levels of validation or adjustment. In terms of scaling, by leveraging the most up-to-date technologies and hosting in the cloud, we embed solutions that handle vast data volumes while remaining easy to access.
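
As a rough sketch of this layered validation idea, assuming entirely hypothetical rules and data, the example below keeps an as-published view intact while a validated view runs baseline checks and then client-specific rules on top. It is not a description of Rimes’ implementation.

```python
# Illustrative only: an as-published view is left untouched, while a
# validated view applies baseline checks plus client-supplied rules.
# All names, rules and thresholds here are hypothetical.

from dataclasses import dataclass

@dataclass
class Observation:
    isin: str
    weight: float

def base_checks(rows):
    """Baseline sanity checks every consumer gets (hypothetical rules)."""
    issues = []
    total = sum(r.weight for r in rows)
    if abs(total - 1.0) > 1e-6:
        issues.append(f"weights sum to {total:.4f}, expected 1.0")
    issues += [f"negative weight on {r.isin}" for r in rows if r.weight < 0]
    return issues

def validate(rows, client_rules=()):
    """Run base checks, then any client-specific rules, returning all issues."""
    issues = base_checks(rows)
    for rule in client_rules:
        issues += rule(rows)
    return issues

# As-published data stays available unmodified for front-office use.
as_published = [Observation("US0378331005", 0.62), Observation("US5949181045", 0.40)]

# A hypothetical client rule: no single constituent above a 50% cap.
def concentration_cap(rows):
    return [f"{r.isin} exceeds 50% cap" for r in rows if r.weight > 0.5]

for issue in validate(as_published, client_rules=[concentration_cap]):
    print("VALIDATION:", issue)
```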

There’s a great expression – “data quality is everyone’s job.” But I would expand it to: Data quality is everyone’s job, but experts should lay the groundwork and do the heavy lifting.

Many institutions across the industry are investing in new data platforms that require benchmark data. What part does Rimes play in supporting these initiatives?

Benchmark data management is highly challenging, requiring specialist knowledge and experience. It’s no surprise that in the same study I mentioned earlier, 41% of respondents said they expect to dramatically increase their use of third-party service providers for their fund administration function, or that new data platforms are spinning up to support this demand. As firms move to these new platforms, the expectation is for a high-quality service with a strong backbone of benchmark data management expertise and the ability to scale.

As asset servicers deploy state-of-the-art data platforms, several vital considerations come to mind. Foremost is the need for highly automated, seamless integration capabilities to ensure diverse data sources are unified into a coherent, accessible framework. Additionally, the flexibility to adapt to changing demands (both internal and regulatory) and the scalability to manage growing data volumes are critical. Embedding data security and compliance with global standards in the platform’s design to safeguard sensitive information is also essential. Lastly, user-centricity is paramount; these platforms must offer intuitive interfaces and tools that empower users to extract maximum value from their data with minimal friction. 

The way I see it, data platforms aim to add their unique secret sauce to maximise client benefits. However, the essence lies in building on data from a vendor known for robust data quality and a deep grasp of both industry benchmarks and client challenges.