Latency: It’s About Data
Latency is often thought of as a gating factor in accessing a pool of liquidity and executing a trade, but this is just one dimension of latency. Another, and possibly more important, dimension is the speed with which data can be ingested and analyzed to determine whether to enter or exit positions.
“The trade-offs that we have had to address over the years were not just about how to trade a position more quickly than someone else, or access a pool of liquidity more rapidly than a competitor, which is often what latency means to most people,” Philippe Burke, portfolio manager and founder of Apache Capital Management, told Markets Media. “What’s becoming as important now, certainly for us, is the ability to access large amounts of information and analyze it quickly to arrive at an actionable trading signal that might give us an edge.”
A few years back, the input problem was simple: information was mostly well-behaved numbers that could be stored in tabular format for easy retrieval and analysis, and adding more information was a simple matter of scaling up by adding more table columns and rows.
This task has become more complex: the information sought today is qualitatively more diverse, including not only numbers but also text, moving images on a screen, temperature color readings, and machine-generated simulation results. The volume of all this information is also much larger, and it is growing at an accelerating pace.
“This change in the quality and quantity of input information has forced significant modifications in the architecture of databases and systems to enable the storage and retrieval of now far more diverse and voluminous data sets, and also the rapid analysis of this information to arrive at a usable trading signal — the ‘fail often, fail fast’ model — using in some cases newer statistical tools, including machine-learning methods,” Burke said.
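The “fail often, fail fast” idea can be sketched as a toy screening loop: cheaply score many candidate signals against data and immediately discard anything below a coarse cutoff, so that only promising candidates receive deeper analysis. Everything below — the function names, the moving-average signal, the synthetic returns, and the cutoff — is an illustrative assumption, not a description of Apache Capital's actual methods.

```python
import random
import statistics

random.seed(42)

# Synthetic daily returns for illustration only (not real market data).
returns = [random.gauss(0.0002, 0.01) for _ in range(1000)]

def moving_average_signal(rets, window):
    """Hypothetical signal: hold a long position (1) when the trailing
    mean return over `window` days is positive, otherwise stay flat (0)."""
    positions = [0] * len(rets)
    for i in range(window, len(rets)):
        trailing = sum(rets[i - window:i]) / window
        positions[i] = 1 if trailing > 0 else 0
    return positions

def quick_evaluate(rets, positions):
    """Crude Sharpe-like score of the strategy's returns -- cheap enough
    to run over many candidates and reject the losers fast."""
    strat = [p * r for p, r in zip(positions, rets)]
    sd = statistics.stdev(strat)
    return statistics.mean(strat) / sd if sd > 0 else 0.0

# 'Fail often, fail fast': score every candidate window cheaply and
# keep only those that clear a coarse cutoff for further study.
candidates = [5, 10, 20, 50, 100]
survivors = [(w, quick_evaluate(returns, moving_average_signal(returns, w)))
             for w in candidates]
survivors = [(w, s) for w, s in survivors if s > 0.02]
print(survivors)
```

In a production setting the same pattern would run over far larger and more heterogeneous inputs, but the economics are identical: the cheap first pass exists so that most candidate ideas can die quickly.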
In the early days of high-frequency trading, much of the effort consisted of deploying expensive hardware to access pools of liquidity more quickly, and thus execute more profitably, than competitors. The return on this “arms race” has dropped materially, and part of the industry’s effort has now shifted toward accessing and analyzing ever-larger data sets.
Convergex’s U.S. Equity Market Structure “Flashback” Survey, exploring the concerns and actions of financial industry participants regarding high frequency trading, reveals an investment community with remaining concerns about U.S. equity market structure, yet feeling more positive than one year ago.
A majority of respondents (57%) still say that markets are not fair for all participants, down from 70% who said the same thing in April 2014, and more than one-third (36%) describe HFT as “harmful” or “very harmful” to investors, versus 51% in the previous survey.
“Wall Street’s perception of markets has clearly shifted,” said Eric Noll, Convergex president and chief executive officer, in a release. “Today’s market structure is complex and challenging. Investors are more comfortable now than they were a year ago, but they’re still largely unsure of how this impacts them and what changes they should make to the way they trade.”
Adds Burke, “Today, for us, it’s much more about how fast I can retrieve and manipulate large data sets to come up with a useful recommendation. In that process, you often face trade-offs. For instance, obtaining higher precision on a trade recommendation may only come with more analysis and end up being too time-consuming to be actionable. Or some techniques may raise your odds of finding a high-probability trade, but in the process force you to forgo explanatory insight.”
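The precision-versus-time trade-off Burke describes can be illustrated with a deadline-bounded estimator: each additional sample sharpens the estimate, but a signal that arrives after its window of usefulness is worthless, so computation stops when the time budget expires. The function name, the budget, and the 0.55 “win” probability below are all invented for illustration.

```python
import time
import random

random.seed(7)

def estimate_under_deadline(sample_outcome, budget_seconds):
    """Refine a Monte Carlo probability estimate until the time budget
    expires, then return whatever precision was achieved by the deadline."""
    deadline = time.monotonic() + budget_seconds
    hits = trials = 0
    while time.monotonic() < deadline:
        hits += sample_outcome()  # 1 on a 'win', 0 otherwise
        trials += 1
    return (hits / trials if trials else None), trials

# Hypothetical binary event with true probability 0.55 (illustration only).
prob, n = estimate_under_deadline(lambda: random.random() < 0.55, 0.05)
print(f"estimate={prob:.3f} from {n} samples")
```

Widening the budget buys a tighter estimate; shrinking it yields a rougher but earlier, and therefore possibly more actionable, answer — which is exactly the trade-off being described.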
Prior to founding Apache Capital, Burke was a principal and head of global leveraged strategies at Morgan Stanley, and previously had been at Lehman Brothers, where he was a senior vice president and proprietary trader. New York-based Apache Capital Management manages client capital as a regulated hedge fund.
Featured image by djvstock/Dollar Photo Club