Technology: the story so far

John Gubert looks at how technology has developed so far and where there is still work to be done.

The technologists have taken over the universe and the post-trade world is blithely lapping up the concept that its workplace is transforming. The only question is when. Bill Gates' comment that we all exaggerate the impact of technology in the near term but underestimate it in the longer term is cited as proof that the market will evolve within ten years at most. The hype talks of automation, digitalisation, distributed ledgers and artificial or cognitive intelligence becoming king. The investment advisor of the future will be a robot. The investment manager will be an application. Clients will be digitally connected. The only humans in the value chain will be developers and hackers!

There is major change in flow and technology is the driver. But we need to consider some very basic and boring processing and commercial issues that we have been trying to resolve for decades without much progress and which are prerequisites for the success of the new technologies. And we need to remember that changing communication modes and greater process efficiency will eliminate people from the back and middle offices but will lead to a need for more expensive and high-quality management in areas such as production, new products and delivery.

The major inhibitors I see to automation are black data pools; intentional ambiguity in legal documentation, especially in the issuance chain; competitive pressures within product and geographical segments; regulatory or legal barriers to progress, some justified and some ridiculous; and, finally, the financial inhibitors, whether the quantum of funds at risk or the replacement cost of legacy technologies and operating models.

If we are to renew our approach to data, we need to reconcile the data pools within our organisations as well as between us and any organisation we partner with in a world of distributed ledgers or shared information platforms. That is a standards issue, difficult to crack in a global market, but also a quality issue covering the integrity of static data, transactional data and information. We also need to stop mouthing calls for big data and assess the need for big valid data, for many mistake quantity of data for quality. The monetisation of data is a separate issue, but it depends on these matters being resolved. I would note that in August 2003, I stood down as Chairman of the Reference Data User Group after struggling for many years to get alignment between reference data such as BICs and ISINs. It is still far from aligned!
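Even at the level of a single identifier, data quality can be checked mechanically. As an illustration of the kind of validation a reference data pool should apply at ingestion, the sketch below verifies an ISIN's structure and its ISO 6166 check digit (a Luhn sum over the letter-expanded string); the function name and its scope are my own choices, not an industry standard.

```python
import re

def is_valid_isin(isin: str) -> bool:
    """Check an ISIN's shape (2-letter prefix, 9-char NSIN, 1 check digit)
    and its ISO 6166 check digit via the Luhn algorithm."""
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
        return False
    # Expand letters to two-digit numbers (A=10 ... Z=35); digits stay as-is.
    expanded = "".join(str(int(c, 36)) for c in isin)
    # Luhn: from the right, double every second digit and sum the digit values.
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

Such checks catch transcription errors within one pool; they do nothing for the harder problem the paragraph above describes, which is semantic alignment of identifiers such as BICs and ISINs across pools.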

In so far as documentation is concerned, especially in the new issue market, translating prospectuses into data elements is difficult. First, many lawyers refuse to do so, concerned that such a process could create risk, given that they have carefully crafted constructively ambiguous wording, at least in part to protect their clients, into such documents. The number of data elements needed to automate all international market (ICSD) new issues amounts to a hundred or so, and the process is undertaken by the ICSDs with, no doubt, hundreds of checks then performed across firms all over the world. For many years I chaired ISMAG, an entity created by the ICSDs to harmonise issuance and asset servicing processes in international debt securities. The exercise worked for most in the banking and brokerage sectors but not in the hoped-for creation at source of dependable, machine-readable data on issuance. The market cites risk and the need for quality guarantees as reasons not to depend on a trusted source for such data, and, by doing so, creates risk and inefficiency. Given the existing role of the ICSDs in this area and their historic lack of material error in their processes, perhaps participants could revisit the issue, allowing for the fact that a single source will most likely not satisfy regulatory bodies.
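To make "creation at source" concrete: a machine-readable new issue reduces the prospectus's economic terms to typed, validated fields that every downstream system interprets identically. The sketch below shows a handful of such elements with simple business rules attached; the field names and rules are purely illustrative assumptions, not any ICSD's actual schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class NewIssueTerms:
    """Illustrative subset of the ~100 data elements a new issue might
    carry; field names are hypothetical, not an industry schema."""
    isin: str
    issuer_lei: str          # ISO 17442 legal entity identifier
    currency: str            # ISO 4217 code, e.g. "EUR"
    issue_date: date
    maturity_date: date
    coupon_rate: float       # annual rate, e.g. 0.0425 for 4.25%
    day_count: str           # e.g. "ACT/ACT", "30/360"
    denomination: int        # minimum tradable amount

    def validate(self) -> list[str]:
        """Return a list of rule violations; an empty list means the record passes."""
        errors = []
        if len(self.isin) != 12:
            errors.append("ISIN must be 12 characters")
        if self.maturity_date <= self.issue_date:
            errors.append("maturity must follow issue date")
        if self.coupon_rate < 0:
            errors.append("coupon rate must be non-negative")
        return errors
```

The point of such a structure is that the checks live with the data once, at source, rather than being re-implemented in hundreds of firms' reconciliation routines.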

There is collaboration across many of the distributed ledger propositions, but it needs to progress further and more sequentially. There is welcome progress in some basic areas. Private issuance with limited participants has already used such technology. But there is little clarity, especially around the technical hurdles of scale and complexity, as to how the concept will move from test sites and test products to the core of the value chain. It needs to accommodate transactions, holdings, portfolios and asset pools, with all the dynamics that entails, to ensure that we do not end up with a front-end distributed ledger supported by multiple traditional proprietary asset servicing applications. It is an ideal technology for static data distribution, albeit in areas such as standing settlement instructions or Know Your Customer there are already models of varying efficiency and accuracy operating on legacy platforms in the market. But I know of no one capable of drawing up a sensible road map for the evolution from legacy to distributed ledgers whilst taking into account some of the challenges I have already mentioned. In a global world where funds, suppliers and markets are multinational, joining the universe is critical. Otherwise, pools of non-participating entities, especially around quasi-monopolistic industry infrastructures or end clients without commercial incentives to migrate from the status quo, will create areas of exception processing whose combined cost will destroy the economic logic of any development.
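Why a shared ledger suits static data distribution can be shown with a minimal sketch: each update to, say, a standing settlement instruction is hash-chained to its predecessor, so every participant holding a copy can verify independently that no entry has been silently altered. This is a deliberately simplified toy under my own assumptions, not any vendor's design, and it ignores the consensus and scale questions raised above.

```python
import hashlib
import json

def append_entry(chain: list, payload: dict) -> list:
    """Append a static-data update, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    digest = hashlib.sha256(
        json.dumps({"payload": payload, "prev_hash": prev_hash},
                   sort_keys=True).encode()
    ).hexdigest()
    chain.append({"payload": payload, "prev_hash": prev_hash, "hash": digest})
    return chain

def verify(chain: list) -> bool:
    """Recompute every link; any tampered payload breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        expected = hashlib.sha256(
            json.dumps({"payload": entry["payload"], "prev_hash": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected or entry["prev_hash"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True
```

The design choice worth noting is that integrity is verifiable by any holder of the data, which is precisely the property that legacy bilateral reconciliation lacks.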

Regulation and law will need to adapt to any changes. New technologies and single data sources create two key risks that need to be overcome. The first is the law applicable to the data and the second is the acceptability of the single source itself. There is concentration risk in blockchain models, and there is technology and contagion risk in shared solutions. New technologies may be far more robust than legacy platforms, but the value of world capital markets is estimated at $118 trillion, and the risk of cybercrime will always be present with such spoils in mind. And, as we have learnt in the past, individual user gateways and IT security can quite often be the weak point in the chain. The more efficient the new joined-up environment is, the more parties it will embrace and the greater such risks become. In an era of quantum computing, it would be scandalous for parties to believe that the new ecosystem is proof against greed and crime.

There is change in the air. There are barriers to overcome. There are centres of excellence being created. Potentially there will be new entrants. But the market is not risk free and the transition is complex. Scarce capital is needed for investment, and profitability per dollar of assets in the system is unlikely to rise in the foreseeable future. Equity or other long-term capital is needed for investor protection, even in the new world, and that will deter many non-banks from becoming service, as distinct from facilities, providers.

But, above all, we need a vision that goes beyond the immediate or the stratospherically philosophical. There are practical issues to overcome: some long-standing ones have proven difficult in the past, and many of the new ones will be challenging in the future. Who can provide that vision? Some, in the Far East especially, appear to be laying the building blocks. But the old world, with its legacies, mindsets and, in many cases, lack of funding, still has the bulk of professional assets and needs to make a few quantum leaps.