Aggregate to Accumulate
Risk data aggregation, and the challenges of dividing up data and assigning responsibility for its pieces.
Kate Toumazi, Global Head of Risk Data Services, Thomson Reuters, explains how firms should deal with enterprise-wide data and the complications of dividing or aggregating it.
1. How should data dictionaries or definitions be established as a foundation for data aggregation efforts?
In accordance with Basel, financial institutions must take an enterprise approach to managing their risk and have a robust system that uses consistent data across the entity. A strong data architecture is unquestionably critical for risk data aggregation, and a key facet of any firm-wide data architecture is having consistent data dictionaries. In reality, however, most firms face technical challenges that are compounded when differing data dictionaries are used across the firm. In an ideal scenario, a firm would pick a best-in-breed dictionary and roll it out across the entire enterprise. This may mean tweaking existing capabilities so that a broader array of data can be harmonised into a single, more scalable model.

A recent survey of globally systemically important banks (G-SIBs) further highlighted the challenge firms are facing: it showed an increase in the number of banks unlikely to be compliant with BCBS 239 by the 2016 implementation deadline. In fact, more than half of those surveyed said they are not going to be ready. This truly underscores the complexity of the challenge, which is growing rather than shrinking, and the need for a solution remains critical. We can all hear the regulatory clock ticking, and firms need to work towards the best viable solution for their business given their existing infrastructures.
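The idea of rolling a best-in-breed dictionary out across an enterprise can be sketched as a mapping exercise: each source system's local field names are translated into one canonical dictionary, with gaps flagged rather than silently dropped. The field names, desk names and mappings below are purely illustrative assumptions, not any real bank's schema.

```python
# Hypothetical sketch: harmonising two desk-level data dictionaries into one
# canonical, firm-wide dictionary. All names and mappings are illustrative.

CANONICAL_FIELDS = {"counterparty_id", "notional", "currency", "maturity_date"}

# Per-source mappings from local field names to the canonical dictionary.
SOURCE_MAPPINGS = {
    "rates_desk": {"cpty": "counterparty_id", "ntl": "notional",
                   "ccy": "currency", "mat_dt": "maturity_date"},
    "credit_desk": {"counterparty": "counterparty_id", "notional_amt": "notional",
                    "curr": "currency", "maturity": "maturity_date"},
}

def harmonise(source: str, record: dict) -> dict:
    """Translate a source-system record into the canonical dictionary,
    raising an error if any canonical field is left unmapped."""
    mapping = SOURCE_MAPPINGS[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = CANONICAL_FIELDS - out.keys()
    if missing:
        raise ValueError(f"{source}: unmapped canonical fields {sorted(missing)}")
    return out

trade = {"cpty": "C123", "ntl": 10_000_000, "ccy": "USD", "mat_dt": "2026-06-30"}
print(harmonise("rates_desk", trade))
```

The point of the explicit `missing` check is that harmonisation failures surface at the point of translation, not downstream in an aggregated risk report.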
2. Who should the stakeholders be, and what should their roles be, when assigning responsibilities for data domains?
To comply with Basel, firms must be proactive in how they govern and oversee their risk systems, policies and procedures. They must truly own how they measure and mitigate risk. In light of this, one of the biggest organizational changes we have seen across numerous firms is the appointment of a chief data officer who reports to, or operates on behalf of, the board. We believe this trend will continue for two reasons. First, by elevating the importance of the data function within the organization, firms are highlighting the strategic importance of getting it right. Second, and perhaps more importantly, it assigns accountability to a specific senior individual.

The stakeholders for risk data aggregation clearly sit across numerous parts of the organization, including risk, finance, IT and data operations, and these functions must all work together to create the overall structure and composition of the governance and delivery organization. Front office and back office are often not joined up, and the front office in particular is often not incentivized to input accurate data, which results in manual interventions later to correct it. By making a single senior figure responsible for data across the organization, many firms are looking to address these problems, and they are far more likely to succeed in spite of the fragmentation.
3. Can enterprise-wide data be broken down, scrutinized and reorganized to address risk management? How should that process work?
A bank should be able to generate accurate and reliable risk data to meet the necessary reporting requirements. To accomplish this, data should be aggregated on a largely automated basis to minimize errors.
Only with an enterprise-wide view can the data be aggregated to truly address risk management. Fragmentation is public enemy number one when it comes to aggregated risk management. That does not necessarily mean the only way to scrutinize the data is a single granular data repository across the entire firm, with a single risk management system feeding off it. We see many banks, for example, looking to technology solutions that create a federated model: a layer over and above their existing databases that establishes a single data model midway through the data lifecycle.
How far banks need to go towards this depends on how consistent their data models are and at what level they are looking to aggregate their risk: by country, by region, by group, or at some other level. Even if silos of data are not being physically broken down, one thing is certain: the collection, storage and maintenance of the data can no longer be managed in a silo if firms are to fully address their risk management challenges.
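The federated idea described above, a thin layer that reads silos in place and rolls exposures up at a chosen level, can be sketched in a few lines. The silo contents, field names and country-to-region map below are illustrative assumptions, not a real implementation.

```python
# Hypothetical sketch: a federated aggregation layer over separate silo stores,
# rolling exposures up by country, region, or group. All data is illustrative.

from collections import defaultdict

REGION_OF = {"UK": "EMEA", "DE": "EMEA", "US": "AMER", "JP": "APAC"}

# Each silo keeps its own records; the layer reads them without merging them.
silo_a = [{"country": "UK", "exposure": 5.0}, {"country": "US", "exposure": 3.0}]
silo_b = [{"country": "DE", "exposure": 2.0}, {"country": "JP", "exposure": 4.0}]

def aggregate(silos, level="country"):
    """Sum exposures across all silos at the requested aggregation level."""
    totals = defaultdict(float)
    for silo in silos:
        for rec in silo:
            if level == "country":
                key = rec["country"]
            elif level == "region":
                key = REGION_OF[rec["country"]]
            else:  # "group": the whole firm as one bucket
                key = "GROUP"
            totals[key] += rec["exposure"]
    return dict(totals)

print(aggregate([silo_a, silo_b], level="region"))
```

Because the silos stay physically separate, only the aggregation logic and the shared keys (country codes, region map) need to be consistent, which is the trade-off the federated approach makes.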
4. What impact is the stress testing regimen of CCAR and BCBS 239 having on risk data aggregation efforts?
The Basel Committee on Banking Supervision defines "risk data aggregation" as "defining, gathering and processing risk data according to the bank's reporting requirements to enable the bank to measure its performance against its risk tolerance/appetite". BCBS 239 is core to this statement, and its specific data standards highlight the vital role data plays in implementing true risk data aggregation.
The biggest impact we are seeing is increased investment in data aggregation. It goes without saying that, post-2008, most major institutions were looking at how they could improve their aggregation to avoid the same lack of transparency and inability to respond to market and credit risks on a timely basis, but today's regulations are adding extra pressure. The fact that BCBS 239 has milestones requiring firms to report on their progress has also kept this at the top of the agenda.
The other facet of the regulations that differs from what may have been in place before is the explicit requirement to provide a forward-looking assessment of risk to senior management. This includes forecasts or scenarios for key market variables and their effects on the bank, giving senior management a much-needed view of the likely future trajectory of the firm's capital and risk profile. This change adds yet another layer of complexity to what is already a substantial undertaking. It also drives further investment into data aggregation efforts to ensure that not only are historic and current risk calculations and measures consistent, but any forward-looking views are also modeled in a consistent way.
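A forward-looking scenario view of the kind described above can be illustrated with a toy calculation: apply scenario shocks to a couple of key variables and recompute a simple capital ratio. All figures, scenario names and shock sizes below are made up for illustration and are not a regulatory stress model.

```python
# Hypothetical sketch: a minimal forward-looking stress view. Apply scenario
# shocks to capital and risk-weighted assets (RWA) and recompute the ratio.
# All numbers are illustrative assumptions, not a regulatory model.

base = {"capital": 50.0, "rwa": 500.0}  # in billions, illustrative

SCENARIOS = {
    "baseline": {"capital_hit": 0.0,  "rwa_uplift": 0.00},
    "adverse":  {"capital_hit": 5.0,  "rwa_uplift": 0.10},
    "severe":   {"capital_hit": 12.0, "rwa_uplift": 0.25},
}

def capital_ratio(capital: float, rwa: float) -> float:
    """Capital as a fraction of risk-weighted assets."""
    return capital / rwa

for name, shock in SCENARIOS.items():
    cap = base["capital"] - shock["capital_hit"]
    rwa = base["rwa"] * (1 + shock["rwa_uplift"])
    print(f"{name:9s} capital ratio: {capital_ratio(cap, rwa):.1%}")
```

The consistency point in the answer above maps directly onto this sketch: the same `capital_ratio` calculation is used for the current position and for every scenario, so historic and forward-looking views cannot drift apart.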
Copyright Infopro Digital Limited. All rights reserved.