07.06.2012
By Terry Flanagan

Capital Market Bandwidth Requirements Escalate

Network capacity is buckling under the crush of bandwidth-intensive transactions from high-frequency and algorithmic trading.

Telecommunications is the gating factor in the ability to expand electronic trading.

“From the inception of the transatlantic cable that linked London and New York for foreign exchange, technology has transformed the securities industry,” said Faisal Hoque, founder and chief executive of BTM Corp, a business consulting and software firm.

“But today’s trading landscape is fragmented as a result of constantly evolving technologies,” said Hoque. “The New York Stock Exchange still boasts a physical trading floor, but many bourses have traded in their floors for an electronic interface.”

Hardware, software and networks must not only be adequately sized for today's volumes, but also flexible and quickly, easily scalable.

“Moore’s Law [the observation that over the history of computing hardware, the number of transistors on a chip doubles approximately every two years] affects both hardware and software—processing has to be able to double every 18 months,” said Chris Pickles, head of industry initiatives at BT Global Banking & Financial Markets, a telecoms group.

Networks, which have to move the messages between exchanges, brokers, banks, investment managers and central counterparties, are governed by a different law: Butters' Law.

“Butters’ Law says you need to be able to double capacity every nine months,” said Pickles. “If you look at the expected growth in message volumes from OPRA [Options Price Reporting Authority]—heading for 13 million messages per second per firm—you can see how Butters’ Law relates to this.”
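The gap between the two doubling rates compounds quickly. A minimal sketch (illustrative arithmetic only, not from the article) comparing processing capacity doubling every 18 months with network capacity doubling every nine months over a five-year horizon:

```python
# Illustrative comparison of two exponential doubling rates:
# Moore's Law-style processing growth (doubling every ~18 months)
# versus Butters' Law-style network growth (doubling every ~9 months).

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Capacity multiplier after `months`, given a doubling period."""
    return 2 ** (months / doubling_period_months)

horizon = 60  # five years, in months

moore = growth_factor(horizon, 18)   # processing: 2^(60/18)
butters = growth_factor(horizon, 9)  # network:    2^(60/9)

print(f"Processing growth over 5 years: {moore:.1f}x")   # ~10.1x
print(f"Network growth over 5 years:    {butters:.1f}x")  # ~101.6x
```

Over the same five years, network capacity demand grows roughly ten times faster than processing demand, which is why Pickles singles out connectivity as the pressure point.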

Exchanges have traditionally looked at how they could restrict message volumes so as not to have to upgrade their hardware and software as often, and so that exchange members don’t have to keep upgrading their network connectivity.

“Exchanges have suggested filtering out a proportion of the messages and ticks, but firms today want to receive every single tick, because today their IT systems can make use of that data,” Pickles said.

Firms are implementing links to exchanges and data sources that run at 100 megabits per second and above—in the case of OPRA, one gigabit per second and above. As always, network providers are racing to keep ahead of exponentially growing demand for capacity and speed.
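A back-of-the-envelope sketch of how a message rate translates into raw line capacity (the per-message size here is a hypothetical assumption for illustration; the article does not specify one):

```python
# Rough bandwidth estimate for a market-data feed.
# NOTE: the 10-byte wire encoding below is a hypothetical figure
# chosen purely for illustration; real OPRA messages vary in size.

def required_bandwidth_bps(messages_per_sec: float,
                           bytes_per_message: float) -> float:
    """Raw bandwidth in bits per second, ignoring protocol overhead."""
    return messages_per_sec * bytes_per_message * 8

# OPRA volume cited in the article: ~13 million messages per second.
bps = required_bandwidth_bps(13_000_000, 10)
print(f"{bps / 1e9:.2f} Gbit/s")  # on the order of 1 Gbit/s
```

Even under this conservative assumption, the feed lands in gigabit territory before any protocol or framing overhead, consistent with the gigabit-class links the article describes.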

“Data vendors are still paying money today to buy back from the market the old messages and ticks that fell off their screens without being stored,” said Pickles.

Exchanges have also tried compressing messages so that a larger volume fits down a smaller pipe. However, even that approach is now being reconsidered, “and the major exchanges and data sources are going for the size of network capacity that carries the data unthrottled and fastest”, Pickles noted.
