01.19.2012
By Terry Flanagan

Europe Builds Out Colo

Centralized financial hubs are spreading across the Continent.

European data center providers are creating centralized hubs for financial data, thereby changing the paradigm for sourcing market data.

Through centralized financial hubs, data processing can be done much closer to the source, giving firms the ability to filter and optimize acquired data far earlier in the process.

This not only keeps services in the financial sector competitive, but also ensures the most effective access to liquidity.

The low-latency connectivity to multiple venues and the wide choice of carriers offered by carrier-neutral data centers allow companies to maximize their reach to customers and minimize onboarding time for those customers.

Infront, a provider of real-time market data and electronic trading solutions, houses its solutions at Interxion’s Stockholm data center and City of London data center, in order to provide its financial customers with faster connectivity to exchanges and key liquidity venues.

“Infront can locate much closer to exchanges and trading engines in Europe, thereby achieving some of the lowest possible latencies to Europe’s leading liquidity venues,” Patrick Lastennet, financial services director of marketing and business development at Interxion, told Markets Media.

Infront has access to more than 400 different carriers. Additionally, as Interxion has 28 data centers across 11 countries in Europe, locating with Interxion has given Infront the opportunity to easily expand its operations and services to new locations, said Lastennet.

In addition to Interxion’s superior connectivity, power, coverage and proximity to liquidity venues, financial firms have found great benefit in the communities located within Interxion’s financial hubs.

The critical mass of exchanges, MTFs, sell-side and buy-side firms, market data vendors, clearing houses, as well as technology and connectivity vendors housed in each data center reduces latency, complexity and costs of doing business.

“For instance, if a firm was sourcing a data feed from an exchange, the total throughput for the subscription would be around 100,000 messages per second at its peak, with an average message size of around 50 bytes,” Lastennet said.

In this scenario, with the firm processing data remotely from the exchange, its network needs to handle 100,000 messages per second at 50 bytes each, even if the firm doesn’t actually need all 100,000 messages, he said.

If the firm is processing data in proximity to the exchange source, however, it can apply a filter to only source data from instruments it is interested in and format the data in proprietary messages so they are shorter than those of the exchange.

“So, in this example, the throughput required would then become 50,000 messages at 30 bytes per second, meaning less bandwidth is required for the firm,” said Lastennet. “Being located close to the data source helps financial firms save substantial amounts of time and money by only focusing on the data that matters and reducing latency through proximity and network optimization.”
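Lastennet’s figures can be checked with a quick back-of-the-envelope sketch (the message rates and sizes come from his example above; the variable names and the conversion to megabits per second are ours, added for illustration):

```python
# Remote sourcing: the full exchange feed traverses the firm's network.
remote_msgs_per_sec = 100_000
remote_msg_bytes = 50
remote_bps = remote_msgs_per_sec * remote_msg_bytes * 8  # bits per second

# In-proximity sourcing: filter to the instruments of interest and repack
# into shorter proprietary messages before sending over the wide-area link.
local_msgs_per_sec = 50_000
local_msg_bytes = 30
local_bps = local_msgs_per_sec * local_msg_bytes * 8

print(f"remote: {remote_bps / 1e6:.1f} Mbit/s")     # 40.0 Mbit/s
print(f"local:  {local_bps / 1e6:.1f} Mbit/s")      # 12.0 Mbit/s
print(f"saving: {1 - local_bps / remote_bps:.0%}")  # 70%
```

On these numbers, filtering and reformatting at the source cuts the firm’s required bandwidth by roughly 70 percent.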
