Does One Data Silo Fit All?


Even as they seek transparency into data sources and work to integrate data from disparate sources into a federated model or a single golden copy, financial services data professionals are increasingly examining how to centralize data across silos as a goal in its own right, according to attendees at the European Financial Information Summit, sponsored by Inside Reference Data and Inside Market Data.

Data consistency remains the biggest obstacle to building a global central data utility that can collect data once and distribute it throughout a firm, said Matthew Cox, head of securities data at BNY Mellon. “It’s not the silos that cause the issue, it’s the consistency and the way people take in and interpret data,” he said.

Silos are a natural result of how the business works, observed Stephen Engdahl, New York-based senior vice president, product strategy, at GoldenSource. Too much federation of data has its risks, he said. “If you don’t understand thoroughly where the governance process comes in, what the quality of each of the underlying data sources might be, there’s a risk of pulling it out of a silo and using it for other purposes that it may not be fit for,” said Engdahl. “You might discover that data you are using for very critical decisions was not looked after at the level of quality you needed.”

Central Difficulty
Firms are trying to calculate exposures quickly, and therefore need data centralized as well as identified; both identification and centralization help overcome issues caused by siloed data. “If you don’t have an identifier for data you have produced, then you really are struggling from step one,” said Michael McMorrow, principal at MMM Data Perspectives, who at the time of the event was a longtime enterprise data executive at Allied Irish Banks.
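To make McMorrow’s “step one” concrete, the sketch below (in Python) shows how records from two hypothetical silos could be joined on a shared identifier such as an ISIN so that exposure can be aggregated centrally. The silo layouts, field names and values are assumptions invented for illustration, not a description of any firm or vendor mentioned here.

```python
# Illustrative sketch only: joins records from two hypothetical silos on a
# shared identifier (here an ISIN) so exposure can be calculated centrally.
# Silo names, fields and values are invented for this example.

from collections import defaultdict

# Hypothetical silo extracts, each using the same identifier scheme.
positions_silo = [
    {"isin": "US0378331005", "desk": "equities", "quantity": 1_000},
    {"isin": "US0378331005", "desk": "derivatives", "quantity": 250},
    {"isin": "DE0001102580", "desk": "fixed_income", "quantity": 5_000},
]
prices_silo = {
    "US0378331005": 175.20,  # price per unit
    "DE0001102580": 98.75,
}

def exposure_by_identifier(positions, prices):
    """Aggregate quantity per identifier, then value it with the price silo."""
    totals = defaultdict(float)
    for record in positions:
        totals[record["isin"]] += record["quantity"]
    # Without a common identifier this join is impossible, which is the
    # "struggling from step one" problem described above.
    return {isin: qty * prices[isin]
            for isin, qty in totals.items() if isin in prices}

print(exposure_by_identifier(positions_silo, prices_silo))
```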

The silos can also be caused by the need to trust multiple data providers, observed Cox. “We all accept that one data vendor can’t supply all the information,” he said. “But from what I’ve seen over the years, from a technology perspective, we’re always looking for one system to do everything. We should open our mind to the idea that one technology platform can’t be expected to meet all the needs of an organization. We should think of taking the technology that’s available to suit the needs, but be able to interlink it.”

However, the more data sources, the more difficult it gets to link them all together, said Engdahl. Firms should define the steps needed to get to centralized data through linked sources, and focus on the flaws in that process. “You need that vision of where you want to go,” he said. “Otherwise, you’re just propagating the same things we’ve always had.”

Automating data management can “bury” firms in codes that make the data almost impossible to trace, according to McMorrow. Firms also lack understanding of which data attributes are needed in a particular context, said Alexander Loesch, head of instrument static data at Landesbank Berlin.

With every data vendor tending to use different methods to collect data, an understanding of the rules and formats becomes even more elusive, noted Mark Lindup, director and head of contributions at Fitch Solutions.

Truly Real-Time?
The timeliness of data and the reliability of valuations have also become an issue for accurately calculating exposures. Taking a valuation from the market close of the day before is no longer good enough, said Nick Murphy, product owner at Asset Control. “You’ll get much more accurate pricing when you get valuations at some point in the morning, as close as possible,” he said. “It’s moving in that direction, but it’s a far cry from instant tick-by-tick or instantaneous data coming in.”

Real-time data may not be all it appears, however, because if you discover something in real-time data, said Engdahl, “you can’t transact, execute a trade and get out of a position that quickly before that data point changes.” The data, realistically, is snapshots taken at specific points intra-day and at market close, he said.

More specifically, one must understand what data is being updated, according to Cox. “In a quiet market where there’s very little market trading, that valuation may be two to four months old,” he said. “That’s just as important to know as it is to have data that’s right up to the minute.” Cox stressed that accuracy should trump timeliness: it is better to distribute information that is accurate than information that is merely fast.
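Cox’s point about knowing the age of a valuation can be illustrated with a simple staleness check. The record layout, sample dates and 30-day threshold below are assumptions made for the sketch; a real feed would carry far richer metadata.

```python
# Illustrative sketch: flag valuations whose last observation date is older
# than a chosen threshold, so consumers know the price may be months old.
# The 30-day threshold and record layout are assumptions for this example.

from datetime import date, timedelta

STALE_AFTER = timedelta(days=30)

valuations = [
    {"isin": "US0378331005", "price": 175.20, "as_of": date(2011, 9, 20)},
    {"isin": "XS0123456789", "price": 101.40, "as_of": date(2011, 5, 2)},  # thinly traded
]

def tag_staleness(records, today):
    """Return each record with its age and a flag showing whether it is stale."""
    for record in records:
        age = today - record["as_of"]
        yield {**record, "age_days": age.days, "stale": age > STALE_AFTER}

for row in tag_staleness(valuations, today=date(2011, 9, 21)):
    print(row)
```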

By performing more careful analysis and obtaining more accurate data as a result, said Engdahl, firms should “be able to react and get into and out of positions quicker.”

Transparency and Volatility
Also, firms should pursue data improvements for their own sake, not just as a result of regulatory mandates, added Engdahl. “Regulations may become unclear, deadlines may be pushed out and regulations may change as political or economic environments change,” he said. “It helps make the business case, but there are internal drivers and business benefits for your own efficiency.”

Transparently sourced data can work in tandem with centralized, more accurate data, to increase credibility with regulators, explained Cox. “If we ask vendors for the underlying transparency, we understand what they’re giving us,” he said. “Being able to take that data, understand what’s being done and being able to pass it on to clients to give them the valuations—regulators press for that.”

Compliance requires transparency in the form of being able to deliver the source for data and the rules that produced the data, according to Engdahl. “It means taking the granular and detailed information and providing it in a way that you can analyze it,” he said.
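One way to read Engdahl’s description of transparency is as data lineage: each value travels with its source and the rules that produced it. The sketch below assumes a minimal record structure and an invented FX-conversion rule purely for illustration; it is not a description of any vendor’s model.

```python
# Illustrative sketch: carry the source and the derivation rules alongside each
# data point so they can be handed to a regulator or client on request.
# Field names and the sample rule are assumptions for this example.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataPoint:
    identifier: str
    value: float
    source: str                                           # where the raw input came from
    rules_applied: List[str] = field(default_factory=list)  # how the value was produced

def apply_fx_conversion(point: DataPoint, rate: float, target_ccy: str) -> DataPoint:
    """Derive a new value while appending the rule that produced it."""
    return DataPoint(
        identifier=point.identifier,
        value=point.value * rate,
        source=point.source,
        rules_applied=point.rules_applied + [f"fx_conversion_to_{target_ccy}@{rate}"],
    )

raw = DataPoint("US0378331005", 175.20, source="vendor_feed_A")
converted = apply_fx_conversion(raw, rate=0.73, target_ccy="GBP")
print(converted)
```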

Of course, volatility can throw off the best-prepared data centralization, transparency and consistency efforts. Linking data sources increases firms’ preparedness to handle volatility, explained Cox, who called it potentially “the next big thing” in data management. Volatility can take several forms, according to McMorrow, including shifting regulatory requirements and the tension between large, long-term programs that ought to handle any issue and tactical capabilities built to address specific data problems.

“It’s hard to strike a balance with data management practices that will be better able to react more efficiently,” he said. “There is a constant tension between doing tactical things or reactive things, and planned things.”
