Data Quality: Whose Responsibility is it?

One component in the far-reaching efforts to redress the failings of the financial system post-crisis is the focus on improving data quality, write Mark Davies and Bill Meenaghan.

By Soapbox

One component in the far-reaching efforts to redress the failings of the financial system post-crisis is the focus on improving data quality. In 2008, in response to the global financial crisis, the Financial Stability Board’s (FSB) Senior Supervisors Group (SSG) sponsored a new counterparty exposure data collection programme to measure improvements in market participants’ ability to produce accurate and timely counterparty information.

Worryingly, the group’s recent report, addressed to Mark Carney as chairman of the FSB, finds that progress to date fails to meet supervisory expectations. In particular, it highlights data quality as an area of concern, stating: “Recurring data errors indicate that many firms are below SSG benchmark standards for data quality and cannot measure and monitor the accuracy of the data they submit or rectify quality issues in a timely manner.”

The lack of progress towards a wholesale improvement in data quality is worrying given how critical accurate data is to client on-boarding and Know-Your-Customer (KYC) requirements; trade settlement and reporting; and risk calculations. Any flaws in the information that feeds into these processes will produce unexpected outcomes or distort risk assessments, neither of which regulators nor market participants want.

Improving data quality

Data does not always become incorrect through negligence; over the normal course of business, firms change their company information, which renders counterparty data out of date. But the problem of poor data has grown over time through insufficient maintenance, excessive duplication and dormant records. Given that some institutions have thousands of counterparties on their books, maintaining one version of the ‘truth’ is hard enough. Add to this the complex architecture that many firms run as a consequence of mergers and of product, business or geographic silos, and it becomes almost impossible to separate good data from bad. Simply keeping abreast of the vast amount of counterparty data that needs to be managed is challenging, and requires dedicated resources, which are often in short supply.
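To make these hygiene problems concrete, here is a minimal, purely illustrative sketch of the kind of routine check a data team might automate: flagging probable duplicate records and records that have not been verified recently. The record layout, the names and the twelve-month review threshold are hypothetical assumptions for the example, not a description of any firm’s actual data model or of a mandated review cycle.

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical counterparty reference records: (internal_id, legal_name, last_verified).
RECORDS = [
    ("C001", "Acme Capital LLP",  date(2013, 1, 10)),
    ("C002", "ACME  CAPITAL LLP", date(2011, 6, 2)),   # probable duplicate of C001
    ("C003", "Example Bank AG",   date(2009, 3, 15)),  # dormant record, never re-verified
]

STALE_AFTER = timedelta(days=365)  # assumed review cycle; no regulatory guideline exists today


def hygiene_report(records, today):
    """Flag probable duplicates (same normalised legal name) and stale records."""
    by_name = defaultdict(list)
    for rec_id, name, _verified in records:
        by_name[" ".join(name.upper().split())].append(rec_id)
    duplicates = {name: ids for name, ids in by_name.items() if len(ids) > 1}
    stale = [rec_id for rec_id, _name, verified in records if today - verified > STALE_AFTER]
    return duplicates, stale


if __name__ == "__main__":
    duplicates, stale = hygiene_report(RECORDS, today=date(2013, 9, 1))
    print("Probable duplicates:", duplicates)  # {'ACME CAPITAL LLP': ['C001', 'C002']}
    print("Stale records:", stale)             # ['C002', 'C003']
```

Even a simple check like this only surfaces candidates for review; as argued below, deciding which record is actually correct still requires research and human oversight.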

But irrespective of these challenges, the problem needs to be addressed and, importantly, data must be maintained on an ongoing basis. The SSG notes that data errors over the past five years have not diminished in proportion to the adoption of automated capabilities, but it is our view that automation alone cannot guarantee data accuracy. Accuracy requires dedicated research and expertise; knowledge of where data can be sourced and checked; the ability to work across hundreds of languages; and, ultimately, human oversight to determine definitively what the correct information is.

Broadly speaking, market participants are aware of (and frustrated by) data errors that impact business processes. In research conducted by Omgeo, nearly half of the banks surveyed said that 30% or more of trade fails were due to settlement instruction issues, potentially caused by incorrect underlying data.

And in terms of market participants’ ability to produce timely counterparty information, in a separate survey 60% of respondents cited the need for counterparties to submit trade details more promptly as the area requiring the most attention in achieving faster trade settlement times.

The roles of the market, regulators and providers

Ultimately, it is the responsibility of individual financial institutions to make sure that the information used in business processes and in meeting compliance requirements is accurate. But we see two avenues by which this burden can be eased. The first is greater involvement from custodians in helping buy-side firms maintain and update data, a view the custodian community itself shares: 63% of the banks surveyed agreed that they should have more involvement.

The second, following a wider trend we are seeing in the financial markets, is for all market participants to capitalize on opportunities for a more collaborative approach to managing data. Sharing data cleansing and maintenance efforts across the industry via a centralized utility increases efficiency and drives quality improvements through shared expertise.

But regulators have a role too. Regulatory mandates, in all areas of the financial markets, can help to drive and harmonize behaviors. Currently, there are no guidelines on how frequently data should be checked, updated and verified. While such mandates would be difficult to monitor and enforce, regulators can contribute to improving data quality by driving the introduction and implementation of standards, for example by extending the efforts already made around the legal entity identifier (LEI).

Much progress has been made towards the creation of a global LEI system, a standard that will act as a “bar code” for financial services firms in order to improve risk monitoring and mitigation. The next step, although it is likely to take time, is to standardize the underlying data that feeds into the LEI system. Achieving data standards across all areas of the financial markets, including reporting, KYC and settlement, will be resource- and time-intensive, but necessary.
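As a concrete illustration of what such a standard buys: an LEI under ISO 17442 is a 20-character alphanumeric code whose final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs), so anyone can verify that an identifier is at least well formed. The minimal Python sketch below checks only structure and checksum; it does not, and cannot, confirm that the code is actually registered or that the reference data behind it is current.

```python
import re


def is_well_formed_lei(lei: str) -> bool:
    """Check the structure and checksum of a Legal Entity Identifier (ISO 17442).

    An LEI is 20 alphanumeric characters; the last two are numeric check
    digits validated with ISO 7064 MOD 97-10: replace each letter with its
    numeric value (A=10 ... Z=35) and the resulting integer must be
    congruent to 1 modulo 97.
    """
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    numeric = "".join(str(int(ch, 36)) for ch in lei)  # '0'-'9' map to 0-9, 'A'-'Z' to 10-35
    return int(numeric) % 97 == 1
```

A check like this catches transcription errors at the point of entry, but it says nothing about whether the legal name, address or corporate hierarchy behind the identifier are accurate; that is precisely the data-standardization work still to come.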

Finally, providers of data solutions and market infrastructures should continue to help firms by striving to develop new ways of easing the burden of data maintenance, and making this accessible to all industry participants in a cost-effective way.

What is absolutely critical is that the market, regulators and providers keep data quality top of mind because the message from the SSG is clear: in the five years since the global financial crisis, there has been little progress towards an improvement in data quality. Only when we are dealing with the correct information can we hope to overcome the failings of the last crisis. If we can, as an industry, set our sights on establishing and implementing a robust, streamlined process for maintaining data quality across all firms, we will promote the safety and integrity of the financial markets.

– by Mark Davies, head of Avox, and Bill Meenaghan, global product manager, Omgeo ALERT.

 
