Data Debate: Speed, Quality, Fairness, and Cost

12.15.2016

It’s all about data.

From the time Julius Reuter’s pigeons first flew stock prices between Aachen and Brussels in 1850 to today’s near light-speed transmission, the issues of data quality, speed, fairness, and cost have been at the forefront of trading. We all need it. We all want more of it. And it comes at us lightning fast – sometimes in nanosecond bursts – 24 hours a day, 7 days a week, 365 days a year.

The need is there for good reason. Performance on the buy side and trading profitability on the sell side are directly related to information advantage – the information asymmetry one trader holds over another. Simply put: the trader with the best data feeds, connections, ports, transmission lines, and co-location may have an advantage, and so can execute his or her planned strategy and style better than the competition. Internal data-support infrastructure at firms also affects this capability. This advantage in data transmission continues to be at the heart of regulatory concern around who gets what data and how quickly they get it. Fundamentally, investor protection and the market-structure demands for fairness, liquidity, and order are all data-driven.

The interests around data — from the exchanges that generate data (and gain substantial revenue from it) to redistributors such as Reuters and Bloomberg, to purchasers throughout the industry — are not always aligned. And just to add more complexity to the mix, technology continues to affect the quality, speed, and cost equations.

The great debate affects all traders regardless of type: Is all data good? Why is it so expensive? Who should get the best data – everyone, or whoever has the most cash?

And what about fairness? Shouldn’t everyone have access to the same data to ensure fair competition?

The quality, speed, cost, and availability of market data recently came to a head in the Great North. Just two weeks ago, the Canadian Competition Bureau dropped an investigation into claims that goliath exchange operator TMX had engaged in anti-competitive behavior by using its contracts with dealer firms to block rival market operator Aequitas from creating a cheaper, alternative consolidated tape of Canadian market data.

Meanwhile in the U.S., Business Insider reported that the Securities and Exchange Commission is seeking more comment on an NYSE request to alter its pricing structure for certain transmission, data, and co-location charges. Many in the marketplace are hoping this additional request for comment might serve as the starting point of a broad inquiry into the exchanges’ pricing policies for data, transmission, and co-location fees. As background, firms that connect directly to an exchange receive market data over dedicated WAN links to the exchange, while others take it from market data vendors who resell it. Market data is highly time- and accuracy-sensitive and is essentially a snapshot of the exchange’s order book.
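To make the “snapshot of the order book” idea concrete, here is a minimal, hypothetical Python sketch of what a parsed top-of-book quote record from a direct feed might look like. The field names, values, and latency calculation are illustrative assumptions only and do not reflect any exchange’s actual message specification.

    from dataclasses import dataclass

    @dataclass
    class TopOfBookSnapshot:
        """Illustrative (hypothetical) top-of-book record from a direct feed."""
        symbol: str          # listed security, e.g. "XYZ"
        bid_price: float     # best displayed bid on this venue
        bid_size: int        # shares available at the best bid
        ask_price: float     # best displayed offer on this venue
        ask_size: int        # shares available at the best offer
        exchange_ts_ns: int  # exchange timestamp, nanoseconds since epoch
        receive_ts_ns: int   # local receipt timestamp, nanoseconds since epoch

        def one_way_latency_us(self) -> float:
            """Rough feed latency in microseconds, assuming synchronized clocks."""
            return (self.receive_ts_ns - self.exchange_ts_ns) / 1_000.0

    # Example: a single made-up quote update and its implied feed latency.
    quote = TopOfBookSnapshot(
        symbol="XYZ",
        bid_price=10.01, bid_size=500,
        ask_price=10.02, ask_size=300,
        exchange_ts_ns=1_481_800_000_000_000_000,
        receive_ts_ns=1_481_800_000_000_120_000,  # 120 microseconds later
    )
    print(f"{quote.symbol}: {quote.bid_size} x {quote.bid_price} / "
          f"{quote.ask_price} x {quote.ask_size}, "
          f"latency = {quote.one_way_latency_us():.0f} us")

Whether a firm sees such an update microseconds sooner over a dedicated link or later through a vendor’s consolidated feed is precisely the speed-and-fairness question regulators are weighing.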

“The Commission is concerned that the Exchange (NYSE) has not supported its argument that there are viable alternatives for Users inside the data center in lieu of obtaining such information from the Exchange,” the SEC wrote in its request for additional comment. “The Commission seeks comment on whether Users do have viable alternatives to paying the Exchange a connectivity fee for the NYSE Premium Data Products.”

The SEC is soliciting data, views and arguments on this matter by December 12. The final deadline for all comments is December 27.

Data sellers have an obvious stake in this argument.

[Photo: Joe Cammarata, SpeedRoute]

What differentiates the exchanges that provide and sell their data is the money they charge for the various levels, or tiers, of information they provide. The exchanges alone set the price of this data, and brokers and traders must buy it regardless of price. There is no collective bargaining for it and no group rates. To add regulation to the mix, Reg NMS created the concept of a protected quote, which in turn led to more exchanges and consequently more revenue-generating feeds.

From the exchange perspective, a NYSE spokesperson said: “The increased complexity and fragmentation in financial markets means participants now demand more information to achieve a holistic view of markets. At the same time, both per unit market data and trading costs have steadily decreased, while the appetite for data has grown.”

For a number of years, exchange data revenues have become more important relative to trading and listing fees, especially as the proliferation of trading venues and the commoditization of trading fees have turned into a race to zero – or even to rebates. Insofar as the aggregators/redistributors are concerned, the conversion of raw data into digestible formats merits a price for their services.

“Data costs what it does for basic economic reasons,” said Dash Financial’s Peter Maragos.  “Investors and trading firms need it to best position themselves to generate alpha, and brokers require it to meet their best execution obligations. Since there is no alternative it becomes a scarce resource that is in demand, thus the cost.”

Bruce Fador, President of event data provider Wall Street Horizon, added the following perspective on technology: “The cost of computing continues to rise, especially as vendors take on more significant big data challenges. Investors are increasingly viewing these data sets as crucial to their alpha-generation process, with firms such as Point72, Bridgewater and others investing very heavily in it.”

So how do traders and firms contend with the costs – both current and forecast – associated with market data? For the bulge-bracket and mid-tier brokers, rising costs are likely either passed along or offset by other businesses and services the firm provides. For the smaller firms, contending with the costs of data (switches, ports, connectivity fees, and the like) is a much thornier issue.

“There is no doubt that it is the smaller firms who are hit hardest,” Maragos said. “Reg NMS has created a situation where the exchanges have regulatory-granted pricing power on the data side, which they are using to offset — and many would say in a borderline egregious manner — the reduction they’ve seen in their trading-related revenues. More and more people are viewing this as a de facto tax on market participants and by extension their end customers, and a reckoning around this seems to be coming.”

Joseph Cammarata, Chief Executive Officer at SpeedRoute, was more blunt on the issue of the cost of data and its effect on smaller firms.

“It is a shakedown,” he said. “The exchanges have a monopoly and take full advantage of it.  What can you do when you are required to have the data to comply with the basic requirements of Finra and the SEC (such as 15c3-5, 605/606 reports, routing, best execution, etc.) and there is just a single source for that data?”

“What’s worse is that the exchanges have full control of the fees for receiving that data,” Cammarata continued. “They regularly change their fee schedules, which almost always results in additional charges to their already outrageous data fee schedules.  The market data divisions have become the major profit centers of the exchanges.”

He offered an example: in 2015, a rule change resulted in an additional “internal” redistribution fee for which his firm was held responsible. “Three exchanges followed suit and it has now turned into a massive added expense,” Cammarata said. “You grin and bear it, fearing retribution and audits. The frustration is that this monopolistic behavior is further squeezing decreased profits from all of the small firms.”

As with other issues in the past, the perfect storm of competition, technology, and regulation – driven by speed, fairness, quality, and cost – will make data a work in progress rather than a ‘one and done’ matter.

Andersen Cheng, CEO of British cyber-security firm Post-Quantum, looked ahead and said the data issue will become more of a market headache and might get worse for the sell side in the next twelve months.

“MiFID II is going to cause a fundamental rethink of data volumes — and will force an explosion in the amount of data that must be captured,” he said. “The General Data Protection Regulation presents further data challenges, including the individual’s right to be forgotten — which presents an as-yet unresolved dilemma when sat alongside MiFID II.”

He noted that the cost pressures associated with buying, processing, and securing data serve to keep everyone on their toes.

MiFID II, GDPR, cost, and data security “are going to cause further headaches because many sell-side firms currently store some data outside of their virtual walls — through third parties or cloud providers,” Cheng added. However, “internal compliance departments are finding it increasingly hard to put their trust in an external provider because ultimately the responsibility for the data remains on their head.”
