The last few years have seen the industry talking a lot more about the importance of data in developing new products and services. Evidently, though, not enough investment has gone into actually delivering data of high enough quality to make those products happen, or to keep firms out of the industry spotlight for the wrong reasons.
If we need proof, we need only look at last month’s $400 million fine imposed on Citi by US regulators for what they described as “several longstanding deficiencies” in its data governance and risk management practices.
It isn’t the first regulatory fine the bank has faced for data problems (recall, for example, the Bank of England’s £44 million fine late last year for capital and liquidity management failings, also rooted in data quality), and for the industry as a whole it probably won’t be the last, given regulators’ intense focus on data quality over the last few years. Citi is unlikely to be alone in the failings regulators identified around data management and governance; plenty of firms have similar problems, albeit at different scales. Good risk management is predicated on high-quality, reliable data, like so many other things that are core to financial institutions’ businesses. Compliance? Needs good data. Next-generation technology such as machine learning? Yes, that also requires normalised, easily consumable and accurate data. The list could go on.
So why are so many firms struggling with the integrity and reliability of their data? As an industry, we’ve failed to properly embed responsibility for data into the DNA of our organisations. We’ll always have data silos to contend with: no matter how often you centralise your data architecture, new data sets and business changes such as M&A will add another silo, or 20, to the equation. So, if we can’t reach a nirvana of consolidated, centralised data management in the traditional sense, what can we do differently? Technology is only one side of the coin; digital maturity also requires operational and cultural change.
We’ve seen more and more chief data officers appear at custodians and their clients over the last few years, but a ‘C’ in their title doesn’t automatically give them the power to transform an organisation’s attitude toward data governance and data management. That takes support from the entire C-suite and from the various lines of business. Successful data governance and stewardship require firms to understand their data integrity issues from the business perspective and to give the business an incentive to take ownership of its data assets. This is far from a simple task, and we’ve seen some CDOs exit the building quickly for lack of support across their institution. It takes time and effort, but it is a necessary step to avoid ending up in regulatory hot water and facing heavy financial penalties, as well as the reputational hit of being caught out by clients and regulators across the globe. And the network effect of extraterritorial compliance requirements and global organisations can make those penalties very painful indeed.
To help firms understand where they stand in terms of digital maturity, I have spent the last few months developing a couple of assessments that enable firms to benchmark their progress against their peers. Earlier in the year I did this for corporate actions, and now I have turned to data strategy and reconciliation, two areas central to dealing with data on a day-to-day basis.
You can access the assessments via the links below; you receive an assessment at the end of the seven-minute series of questions. All data is kept confidential and any results will be published in aggregate. The intent is to produce benchmark data across the industry, something we are sorely lacking!
Here is the one for reconciliation: click here.
And here is the one for data management and data strategy: click here.
I’d love any feedback you have, too, and do let me know if you have any questions.