The road to automation is fraught with difficulties

John Gubert highlights the intrinsic challenges the industry faces when seeking to modernise its often archaic systems.

As securities services continue to experience material product and scope change, the target of total digitalisation becomes a far greater challenge. Automation in periods of change is always difficult, for no one can forecast the precise future of reporting, cyber securities, infrastructure models and much more.

I see three core challenges: the growing burden of legacy, the volume of new products and reporting, and the inevitability of more complex infrastructure models. Overhanging these, reporting and settlement deadlines are becoming more difficult and more costly to manage, especially as we move to T+1 with cash penalties for fails. Add to that an ever more hostile cyber security environment, which complicates the processes needed to ensure secure flows across each transaction life cycle.

Most firms run a blend of legacy and successor technologies. The successor technologies are diverse, with add-ons and linkages proliferating over time. For decades many firms have ducked the inevitable: no matter how solid the core processing system, or systems, current overall IT architectures are complicated and fraught with latency challenges. Many systems and applications are not structured for the data-rich business we have become, and their often batch-based or poorly synchronised processing militates against tighter reporting and settlement timeframes.

In 2008 I first attended a conference where I was told that blockchain would supersede all extant transaction-based technologies, and my scepticism about the timeline was treated with courteous derision. Australia embraced the new technological immigrant with open arms and launched a massive development programme to use it as the base for the replacement of its CHESS settlement system. But after a seven-year development programme, the plans were abandoned. I still see DLT as a component of future IT architectures, just not a total replacement. And that complicates life, as the replacement of core legacy platforms and their allied processing plants and applications is unlikely to be a one-to-one process, but rather a one-to-many in a more distributed, linked environment.

The move from legacy to modern should also logically consider different forms of outsourcing: shared private utilities, common industry-wide utilities, use of the cloud and more. The silos of the past are no longer relevant, while cost constraints militate against internal development for all but the largest firms.

A good reason for this lies in the changing nature of the product. I do not see tokens, cyber securities and other new products replacing traditional securities in the foreseeable future. They will co-exist. 

I do see the scope of reporting products growing, driven by demand from both regulators and clients. But markets need to remember the trap many fell into in the early days of automation: we did not build data pools but data swamps. Data reports must be relevant both to the business as a whole and to the specialist owner of each business component. I question how much data we can absorb and how much is duplicative. Two decades ago, I had my team analyse the data I was receiving and the detail of the information that crossed my desk. The challenge was not to cut the volumes massively, but to direct them appropriately. This is more difficult these days: if job titles are anything to go by, we risk having multiple managers of fragments of the business, with the total overview visible primarily at the top of the hierarchical pyramid. I cannot say whether this structure is an improvement over the more generalist approach of my era, but I do question the balance between specialist and generalist.

Industry infrastructures create development, and especially interface, issues for markets. We have seen a welcome commitment to universal message standards by the major infrastructures. But there are substantially more infrastructures than a decade ago, primarily due to the reporting and regulatory tsunami we have experienced over the last two decades.  

We also have the fragmented cyber world with its paucity of standards, both operational and communicative. The reality is that I see no one outside the infrastructure community who could achieve acceptance and effectiveness in aligning all this growing disparity with the norms of the existing traditional markets. The challenge of achieving a logical global business architecture, whether using a single supplier or multiple suppliers, is complicated by the country-agnostic and global nature of new products. The reality is that the concept of domicile is breaking down, with challenging ramifications for legal ownership and risk management. Creating norms within infrastructures, governed by mature financial laws and regulations, is a logical solution to this conundrum.

Just over a decade ago, I spelt out to the Network Forum's predecessor conference the challenges of excessive regulation. That situation remains, but we now need to add technology, new products and complex markets to the equation. Solving these will require serious change in IT, business and market models.

Feedback from markets and the quality of the dialogue in Global Custodian makes one optimistic for the future. Our people are far better qualified and more broadly based than the previous generation. But no one should underestimate the challenge! We need rocket scientists across multiple disciplines to manage that change.