One could be forgiven for believing that technology has become the driver for new initiatives in our business rather than the solution for business problems. This is not to undermine the critical role that new technologies and IT architecture re-engineering are playing in bringing the market closer to a 21st Century model, but we do need to heed some of the lessons of the past and ensure we do not repeat the errors made in earlier landmark disruptive events.
My first real experience of technology as a disruptor came in two parallel developments in the late 1970s and 1980s. The first was the PC, which initially acted as a dumb terminal with basic spreadsheet and word processing capabilities and progressed rapidly to a critical communication tool. But the more important one was the extreme disruption caused by dematerialisation and automation. Dematerialisation was primarily related to share certificates, and automation to the launch, within the new technology, of an almost mirror image of the former manual processing environment. We quickly learnt several lessons. Firstly, industry disruption leads to industry consolidation, with the collapse of the Neanderthals who decry the new world order. Secondly, the volume impact of the removal of physical barriers is seriously underestimated; we had to upgrade the UK Crest platform multiple times to cope with the post-trade ‘big bang’ volume effect. Thirdly, we learnt the importance of clear and visionary specifications from the business before any other work on a project starts, with floating specifications killing many infrastructure and commercial developments. And finally, we realised that defaulting apparently minor processes to manual solutions creates potential bottlenecks in the live environment, with a knock-on effect on the efficiency of the new platform.
My second experience of disruptors came in the latter years of the 20th Century, as we looked to enhance the value of automation by extending its reach and re-engineering processes. The key dynamic here was the move to a more modular IT architecture with better reporting capabilities. The key lessons learnt were that traditional data was an irreconciled mess and that legacy was going to stay. We also learnt about the problems of connectivity in the new modular world, with additional single points of failure as we extended it. And standards became a barrier, especially around communication flows, as e-banking grew in scope; the standards guardians were unable to adopt change in the time frame needed. Especially in the world of securities identifiers, the bureaucracy was wedded to markets of bygone days and managed by technicians committed to purity rather than progress. Our actions also increased capacity exponentially and, by adding functionality and extending the product revenue reach, gained us revenue flow, only for us to start dissipating it by launching a price war that continues to rage ferociously even today.
There is a battle going on
We are now in the transformation era and all the challenges of the first two phases of development still apply. There will be serious people disruption and attrition, with skills needing to be massively upgraded and numbers radically reduced. There also needs to be more attention paid to scoping new developments and placing them in the context of the firm’s operational and revenue universes, for a downside of the sandboxes and other new production environments is that they often appear to focus on areas quite tangential or marginal to a successful business.
And we need to think more about volume implications in a world where data becomes plentiful. Capacity will breed demand as more historical data is stored, made accessible and needed for improved data analytics. We will see the adoption of new data-hungry processes and time-sensitive data analytics, especially with the greater use of clearing house settlement and the insatiable regulatory demand from new initiatives such as CSDR, SFTR or SRD II in Europe. There is a battle going on around data supremacy and data monetisation. It is a battle that will be won by the few major firms rather than the many, for it is technically demanding and investment-dollar hungry, with huge advantages for those with scale. Data will not drive big bang changes to IT architectures, although the front-end, client-facing cover will increase in scope and hide some of the creaking giants in the background. For legacy is here to stay and will be hard to replace as long as it can be supported technically and has the requisite capacity. But we need to be aware that the new data-hungry world adds extra risks. We need to be concerned about crime risk: in an industry moving billions and heavily dependent on third-party contractors to build the new platforms or maintain the legacy ones, there is a real risk of in-house corruption. Outsourcing always adds further risks, and due diligence with critical application partners is becoming ever more difficult, as data security is partly dependent on refusal to disclose. And in a world of ever greater connectivity, cyber security in general becomes an increasing issue, especially as potentially malevolent state players join the criminal fraternity.
We need to ensure that our new ideas bring genuine cost and risk advantage. Look at the most talked-about development, blockchain: a distributed, decentralised, cryptographically secured peer-to-peer platform. How does it compare to alternative environments? We need to distinguish between its use as a general open platform for Bitcoin and the planned closed networks in financial services. The ability to deal with trusted counterparties changes the risk profile and raises the question of whether blockchain is simply being proposed as an alternative to a traditional but shared database. We also need to tackle the issue of standards. Although many would look to SWIFT to assist in this area, the perennial divergence between SWIFT in the back or middle office and front-end/front-office applications highlights the challenge, especially if we are moving to the apparent nirvana of a front-to-back-office solution in our industry.
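To make the comparison with a shared database concrete, the tamper-evidence that hash-chaining gives a blockchain can be sketched in a few lines. This is a minimal illustration only: the block structure and field names are hypothetical, not any production design, and it deliberately omits the distribution, consensus and access-control questions discussed here.

```python
# Minimal sketch of hash-chaining: each block commits to the hash of its
# predecessor, so altering any historical record invalidates the chain.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic serialisation (sorted keys), then SHA-256.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, payload: dict) -> None:
    # A genesis block links to a fixed sentinel value.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "payload": payload})

def chain_is_valid(chain: list) -> bool:
    # Every block must reference the current hash of its predecessor.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )
```

A plain shared database offers no equivalent property: a record can be changed in place with nothing downstream to contradict it, which is precisely why the trusted-counterparty question matters when weighing the two.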
RPA has a familiar feel about it
The cryptographic protections of blockchain are secure in today’s environment, but quantum computing teams in many places must be working on ways of beating the system if it is to trade daily in billions rather than millions. And, in all of this, we need to recognise that blockchain retains the challenges of many linked environments, especially the perennial problem of poor access controls. We will be able to assess the value of blockchain, outside niche firm-to-client solutions, when it gains some limited industrial strength, whether in the commercial sector, where some companies are producing blockchain options or solutions, or in the utility field. The latter will first be manifested in the ASX/CHESS transformation project, the HKEX northbound Stock Connect link to Shanghai and Shenzhen, and the SGX DVP solution for tokenised assets. In other words, blockchain is still in its development phase some half a decade, and a billion dollars-plus of development work, after it was first proclaimed as the panacea to cure all post-trade ills.
Other developments are progressing, but they are also not transformative in their own right for the industry. Much of the robotic process automation (RPA) announced looks very much like the “six options” rule I used nearly 20 years ago to argue for offshoring. Basically, if 95% or more of a process could be executed without discretion, by answering yes/no on a maximum of six alternatives, then it could be outsourced, with the problem transactions being moved back to subject matter experts for discretionary handling. The trouble with RPA, like chatbots, is that it may be used primarily to eliminate the high cost of user and supplier query handling. They have a role, but the architectural decision should be for the business and not the bean counter. The linked issue of artificial intelligence has its uses, but regulators remain nervous about controls. There is enough research indicating the potential for AI contagion, or even abuse, for it to be used only when management is willing to accept the consequences of dependency on the algorithms driving the application. Tokenisation, using much of this new technology, is, as mentioned for the SGX development, gaining traction. Some examples fit today’s processing model and are being commingled with those that do not fit, in an attempt to create the requisite volume for different business cases.
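The “six options” rule above amounts to a simple routing function, and can be sketched as such. The field names, checks and batch threshold below are hypothetical, purely to illustrate the shape of the rule: at most six yes/no checks, straight-through handling when they all answer cleanly, and escalation to a subject matter expert otherwise.

```python
# Illustrative sketch of the "six options" routing rule; all names and
# thresholds are hypothetical, not any real RPA product's API.
from typing import Callable, Dict, List

MAX_CHECKS = 6          # a process qualifies only if <= 6 yes/no checks decide it
AUTO_THRESHOLD = 0.95   # >= 95% of transactions must route without discretion

Check = Callable[[Dict], bool]

def route(transaction: Dict, checks: List[Check]) -> str:
    """Return 'automate' if every check answers yes; otherwise 'escalate'."""
    if len(checks) > MAX_CHECKS:
        return "escalate"  # more than six branches means real discretion
    try:
        return "automate" if all(check(transaction) for check in checks) else "escalate"
    except KeyError:
        return "escalate"  # missing data requires a human judgement call

def process_qualifies(transactions: List[Dict], checks: List[Check]) -> bool:
    """The process qualifies for automation only if >= 95% route cleanly."""
    decisions = [route(t, checks) for t in transactions]
    return decisions.count("automate") / len(decisions) >= AUTO_THRESHOLD
```

For example, with two hypothetical checks such as `t["amount"] < 1_000_000` and `t["currency"] == "USD"`, a compliant transaction routes to `"automate"`, while one missing a field escalates; `process_qualifies` then applies the 95% test over a batch. This is exactly the business-led scoping argument made above: the checks encode business rules, not IT plumbing.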
The excellent International Securities Services Association (ISSA) paper on crypto assets highlights the three main groups of payment tokens, securities tokens and utility tokens, with securities tokens sub-divided into asset-backed and digital-linked. I would suggest the real value of tokenisation lies in solutions around the custody of genuine digital currencies and assets held by trusted intermediaries, including the troublesome issue of handling private keys.
When I started in the securities industry in the late 1980s, the post-trade arena was unregulated and largely domestic, international meant a few third countries, settlement was for the account and the vault was the inner sanctum of the business. When I took my first retirement a few months before Lehman, we were custodians, depositaries and fund administrators, with strong cash and capital market competences, operating globally in a multi-instrument world. Since then everything has become more complex, more risk has been adopted and the only thing to have reduced is fee income per dollar of assets. Change is admirable, but I have tried to show here that the business needs to be in charge and ensure that change adds value to the client, reduces risk and allows us to make that 50% productivity improvement needed for survival without breaking the bank.