The requirement to file internalised settlement reports under the latest instalment of the Central Securities Depositories Regulation (CSDR) has highlighted the state of post-trade technology in today’s market.
In a survey of custody and post-trade members conducted by The Network Forum a month before the 12 July deadline, only 38% claimed to have understood and assessed all impacts of CSDR on their business. While it is possible that there was a late scramble to understand and comply, a more plausible conclusion is that the industry does not know how to handle its data problem.
With 2.5 quintillion bytes of data created globally every day*, how data should be managed is clearly a growing concern for most industries. In the post-trade environment, regulation has only served to generate more data rather than dealing with the existing problem. MiFID II timestamps, KYC, AML and the latest iteration of CSDR are just a handful of regulatory requirements that have added to the rising tide. In contrast, the only regulation so far aimed at governing data itself has been GDPR.
While the regulators have well-intentioned ambitions, the reality has seen companies struggle to comply with the wave of regulations rather than address the broader problems. This means that the true benefits of the wall of data are yet to be realised.
Legacy systems weigh on operations
Outdated legacy systems have been a problem for the industry for several years now, but the growth of the ‘data problem’ is shining a spotlight on the issue. More than 20 years ago, the main post-trade priority was settling trades; while the market has evolved significantly since then, technology has been left behind.
The old technology systems – many of which are still in operation today – were simply not built to handle the level of data and scale of operations the industry processes in today’s world.
The additional data points now required for trade processing, and their conversion into pre- and post-trade transparency, were never a consideration when the ‘latest’ versions of technology were rolled out in various forms around two decades ago. Today, the modern regulatory framework and market structure agenda seek to normalise how all trades are processed – an almost impossible task to achieve under a simple taxonomy.
The first issue lies in communication standards, e.g. SWIFT. Not all SWIFT participants communicate using the same methodology, and not all financial institutions use SWIFT at all, with some favouring flat files or FIX as alternatives.
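The fragmentation this creates can be illustrated with a deliberately simplified Python sketch. The helper names and the flat-file layout are hypothetical, and this is not a real SWIFT or FIX parser; it only shows why the same economic fact arriving in different wire formats forces firms to normalise everything into one internal representation.

```python
# Hypothetical normalisation sketch: the same trade arrives as a FIX-style
# message or an in-house flat-file record, and must be mapped into one
# internal record before post-trade processing can treat them alike.

def from_fix(msg: dict) -> dict:
    # Standard FIX tag meanings: 48 = SecurityID, 22 = SecurityIDSource
    # ("4" means ISIN), 54 = Side ("1" = buy), 38 = OrderQty.
    assert msg.get("22") == "4", "expected an ISIN-identified instrument"
    return {
        "isin": msg["48"],
        "side": "BUY" if msg["54"] == "1" else "SELL",
        "quantity": int(msg["38"]),
    }

def from_flat_file(line: str) -> dict:
    # Assumed in-house layout: pipe-delimited ISIN|SIDE|QUANTITY.
    isin, side, qty = line.strip().split("|")
    return {"isin": isin, "side": side, "quantity": int(qty)}

fix_trade = from_fix({"48": "GB00B03MLX29", "22": "4", "54": "1", "38": "1000"})
flat_trade = from_flat_file("GB00B03MLX29|BUY|1000")
assert fix_trade == flat_trade  # one economic fact, two wire formats
```

Multiply this by every instrument type, market and counterparty-specific format, and the scale of the mapping burden becomes clear.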
Secondly, the manner in which technology is bought, hosted, developed and re-purposed means that tactical operational system processes are as unique as the people who operate them. This second issue is largely compounded by the various ways that business processes and back-office systems have evolved in each organisation, namely through client demand, instrument coverage, technology policy and budgets, not to mention external events such as changes in vendor ownership of key software solutions.
Mining such a disparate pool of data sources not only prevents firms from establishing a golden source of data and processing transactions efficiently, but also impedes their progress in monetising data through real value-based analysis. Extracting those information groups could, if a golden source were achieved, drive the delivery of machine learning and artificial intelligence much further along the agenda.
As the current wave of regulatory change comes to a pause, the opportunity to innovate still takes a back seat to compliance.
The inherent risk within incorrectly stored or mismanaged data creates the potential for exploitation and reputation-damaging fines. Almost 18 months since the introduction of GDPR in the European Union, many firms must now treat data breaches with the same level of risk as CASS breaches. With a number of high-profile and eye-watering fines already issued, albeit to non-financial firms so far, many other firms could be considerably financially disadvantaged too. Getting this right will go a long way towards protecting firms against significant financial loss.
Another factor adding to the time pressure on the issue is the inorganic growth of businesses. Consolidation is becoming increasingly common across Europe as banks and businesses seek cost synergies and scale. But with scale comes greater complexity. Having larger and more sophisticated banks does not reduce the problem; instead, it doubles it.
For many, consolidating for increased synergies has been countered by disparate data collection avenues and storage methods, and this is without factoring in the patchwork of fixes applied to keep those systems operating.
Compliance over innovation
Recycling the wealth of trading data created has become the latest feature of how the industry survives. Satisfying the latest regulatory requirements by reshaping data into what the regulator wants, rather than what is best for the business, has driven investment decisions in recent years. This has led to a short-term, patchwork approach to trading infrastructure – a slow, temporary measure to evolve systems in the face of immediate challenges.
Reforming the post-trade technology environment is a necessary step towards the growth and progression of the industry, with the effective harnessing of data sitting at the centre of innovation. Data should be seen as an opportunity. For example, it could be used to reduce or avoid exposure to the financial penalty regimes expected following next year’s CSDR settlement discipline. Deep analytics on current settlement trends and fail codes per market, instrument and counterparty could help here; the inability to prevent settlement fails caused by poor management of basic data sets such as SSIs will carry a financial impact, as well as the publication of the industry’s worst performers by each in-scope CSD.
Instead, a fundamental failure to adapt business models and systems to accommodate and capitalise on this opportunity means many are struggling just to survive.