Opening Cross: The Longevity of Latency as Smarts Outpaces Speed

Though data latency attracts plenty of attention because it is essential to algorithmic—not just high-frequency—trading, it isn’t the only game in town. And because it is bounded by physical limits—namely the speed of light, at least until someone finds a faster way to transmit data—it has a limited shelf life as a source of competitive advantage compared with inputs that might yield more value over the long term.
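To put that physical bound in rough numbers, here is a back-of-the-envelope sketch. The route distance and fiber refractive index are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope latency floor between two trading hubs.
# Distance and refractive index are illustrative assumptions.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.5         # typical refractive index of optical fiber
DISTANCE_KM = 1_200       # rough straight-line New York-Chicago distance

one_way_vacuum_ms = DISTANCE_KM / C_VACUUM_KM_S * 1_000
one_way_fiber_ms = one_way_vacuum_ms * FIBER_INDEX

print(f"Vacuum floor (one way): {one_way_vacuum_ms:.2f} ms")  # ~4.0 ms
print(f"Fiber floor (one way):  {one_way_fiber_ms:.2f} ms")   # ~6.0 ms
```

However fast the hardware gets, no signal on that route arrives in less than about four milliseconds—which is why latency gains shrink toward a hard floor while smarter inputs do not.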
Meanwhile, the low-latency marketplace continues to grow—by 1.5 percent in 2012 and by 4.5 percent over the next three years, according to Tabb Group, which puts current sell-side spend on data distribution technologies at $3.6 billion.
And beyond the most liquid, exchange-traded asset classes already straining the limits of latency, the over-the-counter markets have a long way to go before they exhaust the potential for latency reduction. Firms are already applying low-latency technologies in these markets, and will surely extend them to “low-frequency” asset classes as the dynamics of those instruments change with the shift toward centrally cleared venues and as investors seek assets with higher potential returns.
This could prompt institutional traders to desert unprofitable equity markets entirely for OTC assets, accelerating the evolution of those markets and increasing data demand—while having the reverse effect on exchanges. To maintain revenues, exchanges would need to lean on other business models, such as a greater focus on derivatives trading and clearing, or—as BT’s Chris Pickles suggests in this issue’s Open Platform—acting as a neutral “messaging hub” between markets and participants.
This would free up equity markets to fulfill what some argue is their true role—enabling companies to raise capital, rather than being barometers of short-term volatility—and increase their appeal to long-term investors concerned about being outpaced by high-frequency traders.
With a different makeup of participants, exchanges may also have to provide more data free of charge to lower-end investors—not an appealing prospect, given that data revenues grew in Q1 while overall exchange revenues fell. However, they could offset any losses by leveraging their central position as aggregators of liquidity and information to capture more data, translate it into new types of datasets and signals, and charge a premium for them. Demand is growing for exchange-like data on OTC asset classes. Examples include Datavision Streaming, a tick-by-tick data service for OTC credit instruments launched last week by credit specialist CMA and based on prices from market participants; Benchmark Solutions’ streaming, market-driven pricing; and even transaction cost analysis for markets such as currencies—like the service launched last week by agency broker ITG—which could provide an additional input for decision support.
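For readers unfamiliar with what a transaction cost analysis service actually computes, here is a minimal sketch of one common TCA metric—implementation shortfall against the arrival price. The function and fills are invented for illustration; this is not ITG’s methodology:

```python
# Minimal TCA sketch: implementation shortfall of a buy order versus
# the arrival (decision-time) price. Fills are invented data.

def shortfall_bps(arrival_price: float, fills: list[tuple[float, float]]) -> float:
    """Cost of a buy order in basis points relative to the arrival price.

    fills: list of (price, quantity) executions.
    """
    total_qty = sum(qty for _, qty in fills)
    avg_price = sum(price * qty for price, qty in fills) / total_qty
    return (avg_price - arrival_price) / arrival_price * 10_000

# Example: EUR/USD buy order, arrival price 1.3000, three partial fills.
fills = [(1.3002, 5_000_000), (1.3004, 3_000_000), (1.3007, 2_000_000)]
print(f"Implementation shortfall: {shortfall_bps(1.3000, fills):.1f} bps")  # ~2.8 bps
```

The point of such a metric as a decision-support input is that it turns execution quality into a single comparable number across venues, brokers and currency pairs.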
And factors currently used to assess risk could be applied to create new trading indicators. For example, risk and portfolio analysis tools provider Axioma last week presented its quarterly risk review, which showed that risk and volatility levels across global markets—except China—were lower in Q1 than in the previous quarter.
One way to reduce risk is to diversify by minimizing correlation—a “diverse” portfolio of stocks that behave similarly is not really diverse at all. Likewise, futures prices may not accurately reflect their underlyings because of factors priced into the contract—oil futures, for example, include the cost of transportation—so Axioma creates synthetic prices, based on the other factors affecting an asset, that more accurately reflect its value. These models are designed to support long-term decisions rather than to track intraday price movements, but why couldn’t they be used in future to create real-time synthetic prices that expose market inefficiencies?
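To illustrate the correlation point, here is a small sketch—using invented returns—that flags a portfolio whose holdings move together and is therefore less diverse than its holding count suggests:

```python
import numpy as np

# Illustrative daily returns for four stocks (rows = days); invented data.
rng = np.random.default_rng(42)
market = rng.normal(0, 0.01, 250)        # common market factor
returns = np.column_stack([
    market + rng.normal(0, 0.002, 250),  # three stocks that track
    market + rng.normal(0, 0.002, 250),  # the market closely...
    market + rng.normal(0, 0.002, 250),
    rng.normal(0, 0.01, 250),            # ...and one independent name
])

corr = np.corrcoef(returns, rowvar=False)  # pairwise correlation matrix
# Average off-diagonal correlation: values near 1 mean the portfolio
# is "diverse" in name only.
n = corr.shape[0]
avg_offdiag = (corr.sum() - n) / (n * (n - 1))
print(f"Average pairwise correlation: {avg_offdiag:.2f}")
```

A portfolio of four names where three are near-copies of the market factor scores far worse on this measure than four genuinely independent names would—precisely the distinction a correlation-minimizing diversification approach is meant to capture.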
So, will low latency become less important over time? No—because it becomes the benchmark rather than the cutting edge, and because high-performance technology providers will be crucial to calculating valuable new data inputs quickly enough to meet that benchmark.