As Data Costs Rise, Rethinking What Really Matters (By Matthew Timberlake, ITG)
As traders compete for liquidity across venues and asset classes, access to quality market data has never been more critical. Being able to consistently obtain accurate and stable price information has a huge impact not only on the success of a single transaction but on the broader success and profitability of the entire trading desk. Market data providers know this well and tend to charge accordingly.
As a result, market data is pricier than ever, having increased steadily in cost over the past decade. This is not just due to increased transaction volumes; exchanges are increasingly turning to market data as a way to boost flagging revenues amid unprecedented challenges to their core businesses. In many cases this has led to an uptick in costs, not so much for the data itself—which is sometimes offered free of charge as part of an initial promotional period—but for the often unavoidable fees associated with handling that data, such as auditing. Under their data agreements, exchanges have the right to evaluate and charge for the various ways their market data is used, a process which itself carries a cost.
This can make it difficult for firms to separate the wheat from the chaff when it comes to cost savings. Any savings realized from consolidating data sources and terminals, for example, may be offset by fees for black box use and/or any number of related processes.
While there have been rumblings about fair and equal access to market data, the industry remains far from any unified effort to address rising data costs, and there has been little guidance on the regulatory front.
With pricey market data likely here to stay, it is a good opportunity for the industry to revisit the basics about what makes market data valuable:
When it comes to weighing the most important features of market data, stability should be at the top of everyone’s list. Given the current volatility picture, one of the most valuable things market information can do is offer traders the peace of mind that “what you see is what you get.” A quote may have been extremely reliable at one point, for example, but if it begins to disintegrate before you can act on it, it matters little how accurate it once was. High quality data should always reflect real-time market conditions and be sufficiently stable to allow traders to act on it with confidence.
Speed is another key feature of high quality market data; it is a major way that exchanges and other data providers strive to differentiate themselves and justify higher costs. The explosion of low-latency and high-frequency trading has put pressure on firms to obtain actionable market information as quickly as possible. To facilitate this, many exchanges sell access to their direct data feeds as an expensive alternative to the slower, centralized SIP feed that remains the standard for much of the industry.
As data costs continue to surge, we expect to see the industry move towards a new structure in coming years, one that allows consumers of data to be much more selective about the specific types of data they receive and how they pay for that data. A fairer, more granular price model would be at the center of such a movement.
Matthew Timberlake is a Managing Director with ITG.