Opening Cross: Even When Latency Isn’t Crucial, It’s Still Crucial

Loathe latency? Sick of speed? Tired of timestamps? Too bad! The markets’ love affair with low latency is as hot as ever, and if you don’t get skin in the game, you risk being too slow to compete with the low-latency Lotharios and finding yourself alone at the bar (or exchange) with no one to talk to (or trade with).
Yes, it’s becoming harder—and for each increment, more expensive—for firms to differentiate themselves purely through latency as data speeds approach the limits imposed by the laws of physics. But that doesn’t mean the issue will go away: whatever the next big differentiator turns out to be, each link in the chain will still have to attain and maintain maximum efficiency to keep everything else flowing smoothly.
At the heart of these efforts are the exchanges themselves, which are responsible for capturing quotes and trades and turning them into the data that drives the market. For example, Canadian exchange group TMX last week announced the completion of its TMXnet GTA (Greater Toronto Area) ultra-low-latency connectivity network between datacenters and firms inside the city of Toronto with roundtrip latency of between 300 and 400 microseconds (IMD, Nov. 6, 2010).
But while it’s important that any system or network perform at low latency once in production, rigorous latency testing beforehand—in the same way that one would test all other aspects of any new system before rolling it out—is crucial to a successful deployment. Therefore, after announcing last year that it would roll out a new, low-latency trading system, the Singapore Exchange is now making sure the system lives up to its billing by testing it—and the new co-location facility in which it will reside—using latency monitoring technology from Corvil.
Meanwhile, low-latency data and trading technology vendor QuantHouse is using Greenline Financial Technologies’ Latency Monitor product to manage latency over its QuantLink order-routing network, using the non-intrusive tools to generate real-time latency views without distorting the measurements, officials say.
Despite these best efforts, market data rarely flows as smoothly as everyone would like. One of the biggest obstacles is the sheer volume of data being funneled into the market. Last week, the Financial Information Forum announced new record data peaks for several exchange feeds, while comments by the European Union’s energy chief about the potential for nuclear disaster in Japan amid other market, macroeconomic and political concerns drove US equity and derivative data rates to new heights of 4.12 million messages per second, according to hardware ticker plant vendor Exegy’s MarketDataPeaks.com website.
But only a fraction of these messages are actual executions; the rest are short-lived quotes that some argue do not provide genuine liquidity, forcing exchanges, vendors and consumers to build in additional headroom at vast expense to handle data that offers no value but threatens to disrupt their operations if they lack sufficient capacity. A truly optimized market will therefore need to include truly optimized trading and quoting activity.
This could involve ever-more sophisticated smart-order routers; analytics designed to identify liquidity (such as the Market Quality Reports that Transaction Auditing Group produces for Europe and now Canada) or optimal times for intraday trading (such as the new features being incorporated into technical analysis provider TraderMade’s Maverick product); or analytics that incorporate other indicators, such as the trade ideas provided by YouDevise, which has acquired rival First Coverage.
But for each of these to function properly, maintaining the lowest latency for data delivery and response will still be key, even if latency itself is no longer seen as the main differentiator.
Copyright Infopro Digital Limited. All rights reserved.