By John Gubert, managing director, JSG Consulting
Some years ago, I recall an industry leader proclaiming that, in the future, technology would displace people in our industry. His clarion call was one person per billion dollars of assets. Technologically, we are now in a different universe from the one we inhabited in the early 1990s. But his vision remains valid.
So how far will machines replace people in our industry? How intelligent will they need to be? What will the standard IT and operational architecture of the survivors look like? Will we move from big data to obese data? And how will all this fit the financial constraints of an asset management industry that is likely to be charging 20 basis points at most, plus long-term performance bonuses?

I suspect the custodian of the future will have business systems and product development specialists, alongside client relationship managers, as the bulk of its headcount. The standard IT architecture will remain modular, but with more components and, as I have often suggested, more cloud-based, cross-entity sharing of specialised modules. Pricing will reflect reality. The efficient managers will be paying under a quarter of a basis point for processing and substantially more for risk mitigation, although the critical risks will shift away from operational shortcomings towards guaranteeing the safety of assets and compliance with regulation. And I suspect we will also see a data revolution, with value-added, interpreted data replacing the bulk of the raw data currently dumped around the market.
Alas, for many in the operations areas of our industry, machines will prevail ever more. We already use technology for the bulk of securities transaction processing. Looking at the process breaks that currently require people, the major causes are industry dinosaurs, competitive aberrations and a huge layer of permafrost blocking progress, primarily from governments and regulators. None of these is sustainable. Examples of the dinosaurs include those among regulators who require lengthy retention of paper copies of a myriad of documents, especially in the transfer agency industry, or the legal profession with its refusal to move away from its elegantly worded, legally opaque and operationally challenging method of communicating with investors and their agents. Competitive aberrations include those who abuse standards and those, especially among infrastructures, who remove choice from users by a variety of nefarious means. And the permafrost covers that band of experts from the past who religiously defy progress on the basis that the old ways are best.
There are some basic critical areas where common sense still has to prevail. The most evident is in the world of communication standards. We need an alignment of trade and post-trade securities identifiers, whether or not this impairs the economic models of Bloomberg, Reuters, SWIFT and others. We need issuer-to-investor standard datasets to cover the core financial flows around primary issuance, including corporate actions and income payments. And we need universal, standard connectivity between central securities depositories (CSDs) and their users. Finally, we need to resurrect the concept of the Global Straight Through Processing Association (GSTPA), although certainly not the governance structure it adopted, to align trade and settlement matching in a seamless and consistent process. Intuitively, that would allow us to migrate almost all securities flows, from issuer through to end investor and their agents, to a straight-through process. Exceptions could be limited to the perhaps just under 1% of flows not covered by the standards. That obviously assumes an incredible level of accuracy in the datasets moved around the markets, and willing compliance from the paper-pushing brigade, who still regard technological advance as a revolutionary excess. But with a straight-through process costing perhaps under one tenth of a simple manual one, and even less relative to a complex intervention, the economic case will force a change from casual to disciplined compliance with industry communication standards and protocols.
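The economic case above can be made concrete with some simple blended-cost arithmetic. The sketch below is illustrative only: the unit costs are hypothetical, and just the ratios come from the argument in this article (straight-through processing at under one tenth of the cost of a manual touch, with roughly 1% of flows as exceptions).

```python
# Illustrative blended-cost arithmetic for the straight-through
# processing (STP) economic case. Unit costs are hypothetical; only
# the ratios (STP at under one tenth of a manual touch, ~1%
# exceptions) follow the article's argument.

manual_cost = 10.0           # hypothetical cost of a simple manual intervention
stp_cost = manual_cost / 10  # straight-through cost: one tenth of manual

def blended_cost(stp_rate):
    """Average per-trade cost at a given straight-through rate."""
    return stp_rate * stp_cost + (1 - stp_rate) * manual_cost

# Moving from a mediocre 80% STP rate to a 99% one cuts the blended
# per-trade cost by roughly two thirds.
print(blended_cost(0.80))  # ~2.8 per trade
print(blended_cost(0.99))  # ~1.09 per trade
```

The point of the arithmetic is that almost all of the blended cost at a mediocre STP rate comes from the small manual tail, which is why disciplined compliance with standards pays for itself.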
But the potential for automation is not limited to basic securities operations. It is a logical component of the fund administration world. Almost the entire gamut of the controls expected of a fiduciary can be automated. They are definitely capable of a 99%-plus automation target! Core fund pricing needs data feeds, comparison algorithms and a calculator, all of which are basic components of existing standard software packages. Complex pricing for funds, often derivative-based, has been simplified by the move of transactions from OTC to traded markets. And adjusting valuations for hedges is merely another computation based on a limited number of data elements, across almost the entire gamut of possible transactions. Cash flow monitoring, investment restrictions and leverage ratios do not need manpower. A good part of the analytical component of the administrator's duties, as well as the production of balance sheets and other reports, can also be automated to a large extent. And transfer agency is an area ripe for re-engineering away from paper, although help is needed from regulators to achieve a world where the identity of a person no longer depends on photocopies of a page in their passport or a utility bill.
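The pricing control described above — data feeds, a comparison algorithm and a calculator — can be sketched in a few lines. This is a minimal illustration, not any vendor's actual package: the instrument codes, feeds, holdings and the 50-basis-point tolerance are all invented for the example.

```python
# Minimal sketch of an automated fund pricing control: compare two
# hypothetical vendor price feeds, flag breaks beyond a tolerance,
# and value the cleanly priced positions. All names, prices and the
# tolerance are illustrative assumptions.

TOLERANCE = 0.005  # 50bp relative divergence triggers an exception

feed_a = {"AAA": 101.20, "BBB": 54.10, "CCC": 12.05}
feed_b = {"AAA": 101.25, "BBB": 54.10, "CCC": 12.85}
holdings = {"AAA": 1_000, "BBB": 2_500, "CCC": 10_000}

def price_breaks(a, b, tol):
    """Return instruments whose two feeds diverge by more than tol."""
    return {k: (a[k], b[k]) for k in a
            if abs(a[k] - b[k]) / a[k] > tol}

breaks = price_breaks(feed_a, feed_b, TOLERANCE)

# Only the breaks need a human; everything else prices straight through.
clean = {k: feed_a[k] for k in feed_a if k not in breaks}
nav_clean = sum(clean[k] * holdings[k] for k in clean)

print(breaks)     # {'CCC': (12.05, 12.85)} -- routed to an analyst
print(nav_clean)  # value of the automatically priced positions
```

The same pattern — feed, comparison, calculation, exception queue — extends naturally to cash flow monitoring, investment restrictions and leverage ratios, which is why a 99%-plus automation target is credible.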
The reality is that, technically, the bulk of operational roles could be automated. So why is this not the case? The truth is that self-preservation, legacy platforms and strung-together IT architecture are the big barriers. The established players have a problem. Their IT world encompasses a series of modules, ranging from the aged and almost archaic to the modern. Component-based architecture is no panacea, though, especially if it straddles in-house platforms across multiple locations, platforms shared with other business lines and common group utilities. Such a mix of stakeholders is not only hard to manage but also places established players at a serious long-term disadvantage relative to potential new entrants. New entrants will be few and far between, and the future optimum operating model may well be a hybrid structure with a processing arm, a risk manager and a bank across two or more entities.
And we need to get to grips with data. There is too much data around. This is obese data. It is data that is redundant. It is produced for bureaucratic reasons rather than commercial need. It may be needed occasionally, and then it should be produced on demand and not as a routine. I have already noted the critical issue of standards, and businesses need to ensure that they operate a logical data warehouse that obviates the need for multiple instances of the same data elements, with all the structural problems the reconciliation of such data creates. Reporting also needs to be tested against a value indicator. Here the regulators are the worst offenders. They have failed to adopt common standards across the different reports their regulations mandate. And one suspects that their inability to mine the tons of data they have cheerfully demanded will be exposed, to their cost, when we have the next, inevitable, major crisis.
The world of the future needs to be a low-cost, high-performance one. The current fee structure of the industry is unsustainable. Performance differences over time between the bulk of index trackers, smart beta or absolute value funds do not justify the differential in fees. If we are moving into a world of lower fees, we need lower costs across the value chain. And technology is a good place to start! It can reduce people costs, around 60% of the total cost base of our industry. It can also reduce process breaks and enhance STP.
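The leverage in that 60% figure is worth spelling out. In the sketch below, only the 60% people share comes from the argument above; the automation fraction is a hypothetical input chosen for illustration.

```python
# Illustrative cost-base arithmetic. Only the 60% people-cost share
# follows the article; the fraction of roles automated is a
# hypothetical input.

def total_cost_reduction(people_share=0.60, roles_automated=0.50):
    """Fraction of total cost removed when a share of people-cost
    roles is automated away."""
    return people_share * roles_automated

# Automating half of today's operational roles would take roughly
# 30% off the industry's total cost base.
print(total_cost_reduction())  # ~0.3
```

Because people are the dominant cost line, even partial automation moves the total cost base far more than an equivalent squeeze on any other line.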
Unfortunately, turkeys don't vote for Christmas. And simplifying the securities services world, which is built on a series of relatively plain-vanilla business components, needs fewer people, more standards, joined-up technology and a desire for simple solutions to the simple challenges we face.