09.25.2012
By Terry Flanagan

HFTs Adjust to New Realities

Firms that employ high-frequency or algorithmic trading strategies are facing up to new realities of the marketplace, which impose restrictions governing the testing of algos and controls over those who actually deploy them, whether on their own behalf or the firm's.

The European Securities and Markets Authority (Esma), the pan-European regulator, has published guidelines on automated trading, and other regulators plan to follow suit.

“High-frequency trading firms, algo providers and DMA [direct market access] providers are quite obviously the target for these guidelines,” said Matthew Coupe, sales director at Redkite Financial Markets, a provider of market surveillance software.

“The document is a good stop gap in tying up some of the loopholes that MiFID created [in 2007], in terms of making sure all flow going through to the market has appropriate controls and processes in place,” Coupe said. “While all firms have pre-trade risk controls in place, having appropriate and efficient market [controls] is a key area of focus.”

Knight Capital’s software mishap and the ensuing jitters have caught the attention of regulators across the globe, from the U.S. and Europe to Asia, producing proposals and guidelines for algo testing.

Esma has issued guidelines for investment firms that operate algos to “develop testing methodologies for new algorithms that might include performance simulations/back testing or offline testing within a trading platform testing environment”.
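The performance simulation and back testing Esma describes can be sketched in a few lines. The following is a minimal, illustrative example only: the strategy, price series and helper names are hypothetical, not part of any regulatory guidance — the point is simply replaying an algo against historical data before live deployment.

```python
# Minimal back-test sketch: replay a (hypothetical) moving-average
# crossover algo against historical prices before any live deployment.

def moving_average_crossover(prices, short=3, long=5):
    """Return a signal per bar: +1 (long) or -1 (short) once both MAs exist, else 0."""
    signals = []
    for i in range(len(prices)):
        if i + 1 < long:
            signals.append(0)  # not enough history yet
            continue
        short_ma = sum(prices[i + 1 - short:i + 1]) / short
        long_ma = sum(prices[i + 1 - long:i + 1]) / long
        signals.append(1 if short_ma > long_ma else -1)
    return signals

def backtest(prices, signals):
    """P&L from holding each bar's signalled position over the next bar."""
    pnl = 0.0
    for i in range(len(prices) - 1):
        pnl += signals[i] * (prices[i + 1] - prices[i])
    return pnl

prices = [100.0, 101.0, 102.0, 101.5, 103.0, 104.0, 103.5, 105.0]
signals = moving_average_crossover(prices)
print(backtest(prices, signals))  # prints 2.0 for this sample series
```

A real testing regime would extend this with the stressed, "foreseeable extreme circumstances" the regulators mention — gapped prices, halted sessions, degraded market data — rather than a single benign price path.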

The Hong Kong Securities and Futures Commission (SFC) has also proposed that “trading algorithms will operate as designed…taking into account foreseeable extreme circumstances and the characteristics of different trading sessions…Deployment of the algorithmic trading system and trading algorithms would not interfere with the operation of a fair and orderly market.”

Louis Lovas, director of solutions at OneMarketData, a provider of tick data management solutions, said: “Regulators across the globe are lecturing on algo testing, showing a firm grasp of the obvious, but completely missing the nuances.

“Algorithms, essential to the sustainability of the business, are born out of the mathematical ingenuity of quants and developers, and their complexity is accelerating due to increasing competition,” Lovas said. “Testing them is focused on two main areas: robustness and profitability.”

Investment firms, from quant trading shops to long-only asset managers, understand the importance of strategy testing only too well.

“These algorithms are complex enough that it takes a seasoned practitioner to understand them just in isolation, and nobody understands what happens as they interact with each other,” David Lauer, a former high-frequency trader, told a Senate Banking Committee hearing last week.

The “Great Quant Meltdown of August 2007” was “an ominous harbinger of what was to come,” Lauer said. “Many people aren’t even aware that there was significant turmoil in financial markets in early August 2007 because of unexpected behavior of long-short quantitative equity funds.”

However, most market participants have stopped short of calling for regulators to test algos themselves.

“I don’t think that would help,” said Coupe. “It would require the regulators to employ quant analysts, and then you would have situations where people working for regulators would leave and start trading themselves.

“[The regulator] should make sure that all market participants have appropriate pre- and post-trade controls in place to observe their trading behavior. One of the greatest areas of issue at the moment is within the regulators themselves, where most of their systems run off transaction reports, which means they will never be able to detect market manipulative practices, as they do not look at orders being submitted to the market.”
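Coupe's distinction between transaction reports and order-level data can be made concrete. The sketch below is purely illustrative — the event format, trader labels and cancel-ratio threshold are hypothetical — but it shows why a feed of executions alone cannot surface a pattern like layering, where a participant submits and cancels orders that never trade.

```python
# Illustrative order-level surveillance check: a trader who cancels
# most orders they submit looks very different in order data than in
# a transaction-report feed, which records only the executions.

from collections import Counter

def cancel_ratio_by_trader(events):
    """events: iterable of (trader_id, action), action in {'new', 'cancel', 'trade'}."""
    counts = {}
    for trader, action in events:
        counts.setdefault(trader, Counter())[action] += 1
    return {t: c['cancel'] / c['new'] for t, c in counts.items() if c['new']}

def flag_suspicious(events, threshold=0.7):
    """Flag traders whose cancel-to-order ratio exceeds the (hypothetical) threshold."""
    return [t for t, r in cancel_ratio_by_trader(events).items() if r >= threshold]

events = [
    ('A', 'new'), ('A', 'cancel'), ('A', 'new'), ('A', 'cancel'),
    ('A', 'new'), ('A', 'cancel'), ('A', 'new'), ('A', 'trade'),   # mostly cancels
    ('B', 'new'), ('B', 'trade'), ('B', 'new'), ('B', 'trade'),    # fills its orders
]
print(flag_suspicious(events))  # a transaction-report feed would only show the trades
```

In a transaction-report view, traders A and B each show a handful of executions and look alike; only the order stream reveals that A cancels three of every four orders.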

Increased surveillance, both from regulators and by individual firms, presents huge challenges from a technology and data management perspective.

“Surveillance engines need access to a consistent and reliable source of market, trade, position and reference data, in order to correlate trading behavior with market events and to understand the complex relationships underlying beneficial ownerships,” said Coupe.
