Michael Shashoua: What Last Year and This Year Can Teach Us for 2017

The lesson to be learned from 2015 is that improvements in data governance planning need to continue.

Michael Shashoua, editor, Inside Reference Data

Inside Reference Data closed out 2015 with a roundtable of data management experts, seeking to identify trends and challenges likely to dominate the industry in 2016. We heard that data centralization is now the main focus of these experts and their colleagues, with debate about how to achieve that centralization still continuing.

To understand how the industry reached this point, it’s worth looking back at what experts said in similar interviews conducted a year earlier, and how those insights held up by the end of 2015.

At the end of 2014, advances in reference data management seemed likely to be incremental, if they happened at all. The development deemed most likely was that data management technology would mature, with its focus centering on integrating data sources and getting firms to establish data strategies or governance plans.

At that time, we found evidence that many firms were taking on data governance challenges. TIAA-CREF had deployed an “acquisition and attrition” model. Data governance development was helping to support analytics, said Julia Bardmesser of Citi, who emphasized the importance of data standardization. Canadian firms, including TD Bank and Canadian Western Bank, had found benefits from making data governance plans cross-functional.

As 2015 progressed, the industry began tying data governance work to risk data management, treating governance as a way to better handle risk-relevant data, especially for compliance with risk data aggregation guidelines such as BCBS 239. The year started out with BCBS 239 driving changes in data infrastructure, but ended with overall readiness to comply still lagging, leaving BCBS 239 as unfinished business in 2016.

Management Methods

Looking forward, as sources from Acadian Asset Management, Chartis Research, Dun & Bradstreet, HSBC and others did at the start of this year, enterprise data management (EDM), master data management (MDM) and the influence of chief data officers (CDOs) are likely to figure in data centralization efforts. Chris Johnson of HSBC cautioned that using an EDM system to centralize data can reduce flexibility, while Robert Iati of Dun & Bradstreet sees consortia such as SmartStream’s SPReD service as influential in breaking down proprietary data silos, thereby facilitating centralization.

With MDM also being used to federate financial industry records, EDM and MDM will have to be harmonized, said consultant Steve Lachaga. The industry cannot be content with good data still sitting in those silos, he added. The capability to integrate and manage multiple databases is certainly available, said Hugh Stewart of Chartis Research, so the industry should capitalize on it by building an expanded data model that makes it possible to reuse the same data for both compliance reporting and risk management.

All of these ideas and potential advances are promising, but they may still require leadership support to become a reality. The growing influence of CDOs in firms’ leadership improves the chances of such projects being supported and implemented. However, according to Brian Buzzelli of Acadian Asset Management, immediate operational and business demands force most CDOs to focus on data quality and adapting data operations rather than on forward-looking improvements, which could weigh down their ability to lead on such visions. HSBC’s Johnson holds out hope, though, that the CDO role will remain fluid enough to hold sway over how data is best managed and how its collection and analysis are organized.

Whether CDOs drive the development of data centralization or not, the lesson to be learned from 2015 is that improvements in data governance planning, which still are not complete, need to continue. Otherwise, at the start of 2017, the industry may be looking back and trying to figure out how and why data centralization efforts are stalled or have failed. 
