It is popular to describe the securities services provider as a curator of data, which has led to eager discussion around the potential monetisation of that data, especially given new and enhanced regulatory reporting requirements. The Dodd-Frank and EMIR regulations have moved a mass of data online as derivatives become ever more exchange-traded and CCP-cleared. New physical aberrations continue to rear their heads, but their lifespan appears to be short-term rather than embedded.
Many institutional investors see the custodian and fund administrator as an insurer of the safety of assets, a processor offering a performance guarantee around the servicing of those assets, a reporting agent for the events pertaining to those assets and a valuation agent for those assets. These providers may also have associated functions around asset financing, cash management, treasury, capital market execution or collateral management. All of these give rise to information needed at client, regulator and custodian or fund administrator level, but the charges levied for the core services already cover many elements of the related cost. MiFID II is the latest animal to create demand for more data; indeed, with over 60 data elements demanded per transaction, the quantum is in the petabytes!
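How those fields compound is simple arithmetic. The sketch below uses assumed, illustrative figures (only the 65-field count and the five-year retention come from MiFID II itself) to show how per-transaction data multiplies through volume, retention and copies along the reporting chain:

```python
# Rough back-of-envelope sizing of retained MiFID II transaction reports.
# Volumes, field sizes and copy counts are illustrative assumptions.

FIELDS_PER_REPORT = 65              # RTS 22 defines 65 reportable fields
AVG_BYTES_PER_FIELD = 60            # assumed encoded size incl. markup
REPORTS_PER_YEAR = 10_000_000_000   # assumed EU-wide annual volume
RETENTION_YEARS = 5                 # MiFID II five-year retention
COPIES_IN_CHAIN = 10                # firm, ARM, regulator, archives, backups

total_bytes = (FIELDS_PER_REPORT * AVG_BYTES_PER_FIELD
               * REPORTS_PER_YEAR * RETENTION_YEARS * COPIES_IN_CHAIN)
print(f"Retained volume: {total_bytes / 1e15:.2f} PB")  # ~1.95 PB
```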
From a post-trade perspective, the issue is where the data curation role of the post-trade world starts and what useful role its providers can play. The reality is that the complexity of data, data formats and data definitions in a global marketplace makes reporting a challenging and costly exercise for all but a few of the largest or most mono-market providers. There is definitely room for the data concentrator. The moment funds become global and have a regular transaction flow, the cost and complexity challenge kicks in. MiFID II is a case in point: somewhere within its 1.7 million paragraphs we can see that it encompasses extra-territoriality, microsecond-level timestamping of reports, potential conflicts of regulation, the likelihood that transposition in individual member states will result in divergent approaches and a requirement for five-year data retention.
The speed demanded for data reporting, the quantum of data needed, the risk of erroneous input in any chain of communication and the regulatory penalties possible for any hiatus in flow, whether from system outage, network latency or human error, all make the data world a high-risk zone. I do understand that some data aggregation can be commercialised as value added, with trade or regulatory reporting in the formats demanded by each repository or regulator being a case in point. The question is whether the industry has the real competence and whether client-to-post-trade interaction is adequate.
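To make that concrete, here is a minimal sketch of the per-destination format fan-out such a service performs. The field names and target formats are hypothetical, not the actual RTS 22 or trade repository schemas:

```python
# A hypothetical trade record rendered once per destination format.
import json
import xml.etree.ElementTree as ET

trade = {"trade_id": "T-001", "isin": "DE0001102580",
         "quantity": 1000, "price": 99.87, "lei": "5299000J2N45DDNE4Y28"}

def to_json_repository(t: dict) -> str:
    """Hypothetical repository expecting flat JSON with renamed keys."""
    return json.dumps({"TradeRef": t["trade_id"], "ISIN": t["isin"],
                       "Qty": t["quantity"], "Px": t["price"], "LEI": t["lei"]})

def to_xml_regulator(t: dict) -> str:
    """Hypothetical regulator expecting an XML envelope."""
    root = ET.Element("TxReport")
    for key, value in t.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(to_json_repository(trade))
print(to_xml_regulator(trade))
```

Every additional destination is another mapping to build, test and maintain, and another point at which an erroneous field can slip into the chain.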
A lot of the data aggregation needed can be achieved through the application of simple report writers. A lot of the data held at custodians will not be proprietary. And a lot of clients see data products, drawn from their holdings with a custodian, as part of the overall service proposition, often needing to be combined with data from other providers to create a data pool that they can then exploit to optimise their own operational and financial performance. Of course, suppliers can produce data universes from their aggregated client base, and logically need to do so to understand the costs of each relationship. Whether that data is exploitable is a difficult question, for few providers have a universe that is comprehensive outside their domestic markets, and aggregation of the data gives rise to issues of client confidentiality and data protection law.
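The "simple report writer" point can be illustrated in a few lines. Assuming each provider delivers a flat extract with ISIN and quantity columns (the file layout here is an assumed convention), pooling holdings across custodians is little more than a keyed aggregation:

```python
# A minimal report-writer sketch: pool position extracts from several
# providers into one holdings view. File layout is an assumed convention.
import csv
from collections import defaultdict
from pathlib import Path

def pooled_positions(files: list[str]) -> dict[str, float]:
    """Sum position quantities per ISIN across provider extracts."""
    totals: dict[str, float] = defaultdict(float)
    for path in files:
        with Path(path).open(newline="") as f:
            for row in csv.DictReader(f):  # expects ISIN,quantity columns
                totals[row["ISIN"]] += float(row["quantity"])
    return dict(totals)

# e.g. pooled_positions(["custodian_a.csv", "custodian_b.csv"])
```

The hard part, as noted above, is not the aggregation itself but the confidentiality and data protection constraints on what may be pooled.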
I suspect that we will face up to the perennial problem. Initially there will be an attempt at unbundling the charges and creating a new revenue stream. But given the elasticity of demand for data, any volume-based charging will add a further unknown cost for the asset manager. One suspects we will see a tension between the ad valorem charge and the unbundled charge, with new data curation and reporting becoming a component of the ad valorem fee rather than a separate charge. In short, the added services risk becoming an added cost, absorbed to withstand the inevitable downward pressure on ad valorem charges in a market suffering from severe overcapacity.
When, in a few years’ time, with so much data retained, we consider its value, I am sure that we will run into the age-old problem. Quite simply, the data held will prove inadequate to help resolve any major issue, both because of its quantum and quality and, especially, because of the technological competence of the holders, whether in the regulatory or private sector. The main value of the data will be to allow near-term analysis of any market abuse, statistical analysis of matters such as systemic risk through concentration analytics and other tools, and undoubtedly a raft of reporting offerings of perhaps useful but probably indiscriminate value.
We are in the age of data drowning, and more data is the apparent solution. The reality is that we need more focus. Much of the new regulation has valid aims in that it seeks to provide market transparency and integrity. But, as always, the lawyers and regulators have transformed it into a dangerous swamp of mass data without a strategy and, in some cases, into a form of domestic market protectionism, rather than ensuring that they receive the right information in a timely fashion and can act on it in the best interests of investors.
The post-trade industry needs to be wary. Data is a powerful tool. Curation of data is more than just custody. Liability could be greater than providers understand today unless they carefully read all the clauses of all the regulatory grammage in circulation. And, to cap it all, who is right? The CEO of Intercontinental Exchange (ICE), who feels MiFID, as an example, “is the worst piece of legislation I have seen in the history of my career”, or the redoubtable Kay Swinburne MEP, who felt “legislators wanted to create a fair, competitive, effective market with a single rulebook across the EU”?