For Data, Quality Trumps Speed
The speed with which market data is received by trading applications is often equated with trading performance, but the reality is that getting bad data quickly is worse than getting good data less quickly.
“Saying that real-time data gives you a competitive advantage is really nothing new,” said Tony McManus, global head of real-time feeds and platform technology at Bloomberg. “What’s much more important is not whether you are using real-time data, but the quality of that data.”
The timeliness of data delivery is one dimension of the data’s quality. “If I get a streaming price coming from an exchange and it is five seconds behind where all my competitors got that price, that price is very low-quality because everyone else has already acted on it,” McManus told Markets Media. “By the time I received it, it is old news, the market has moved.”
A trading decision based on that price is likely to be a poor one, so speed is important. But just as important is whether all of the fields are correct, and whether all of the fields are even there.
“If I am taking a real-time data feed, I get a real-time message,” McManus said. “I go into that message looking for some content and if there is nothing there, there is nothing else I can do with that. Accuracy and completeness are very important.”
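The three quality dimensions McManus describes, timeliness, accuracy, and completeness, can be illustrated with a simple validation pass over an incoming tick. This is a minimal sketch, not anything drawn from Bloomberg's feeds; the message fields, the five-second staleness threshold (borrowed from McManus's example above), and the sanity checks are all illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical tick message; the field names are illustrative assumptions,
# not the schema of any real vendor feed.
@dataclass
class Tick:
    symbol: Optional[str]
    price: Optional[float]
    size: Optional[int]
    exchange_ts: Optional[datetime]  # timestamp stamped by the venue

# McManus's example: a price five seconds behind competitors is "old news".
MAX_STALENESS = timedelta(seconds=5)

def validate(tick: Tick, now: datetime) -> list[str]:
    """Return a list of quality problems; an empty list means the tick is usable."""
    problems: list[str] = []
    # Completeness: every expected field must actually be present.
    for field in ("symbol", "price", "size", "exchange_ts"):
        if getattr(tick, field) is None:
            problems.append(f"missing field: {field}")
    # Accuracy: basic sanity checks on values that are present.
    if tick.price is not None and tick.price <= 0:
        problems.append("non-positive price")
    # Timeliness: the market has likely moved past a stale price.
    if tick.exchange_ts is not None and now - tick.exchange_ts > MAX_STALENESS:
        problems.append("stale price")
    return problems
```

A consumer would typically run a check like this before routing the tick into a trading decision, dropping or flagging any message that comes back with problems.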
According to a cross-industry study by Experian, 50% of those surveyed found fixing data quality issues the most challenging aspect of data management and 92% of companies still find some element of data quality management challenging.
“Many of the organizations interviewed rated themselves as having some data quality management tools, people and processes in place,” said Janani Dumbleton, principal consultant, Experian Data Quality, in a report. “But many still rely on manual processes and some have no data management policy in place at all.”
Breadth of coverage is important “because in the current climate, the cost of acquiring real time data is very significant,” McManus said. “If I need to go to 10 different vendors to get the content that I need, that is going to cost me an enormous amount of money…We are a one-stop shop for real-time data needs, no matter which geography you are trading and no matter which asset class you are trading.”
McManus, who joined Bloomberg in 2012 after serving as head of enterprise software at NYSE Technologies and an executive at Wombat Financial Software, which was sold to NYSE in 2008, has expertise in building real-time market data systems and selling them to funds, systematic traders, and sell-side firms.
At Bloomberg, McManus oversees three product lines: Server API, which provisions real-time data to proprietary and third-party trade and order management systems; B-Pipe, which is Bloomberg’s market data feed; and Bloomberg Data Distribution Platform, a cloud-based managed service.
“B-Pipe provides access to literally millions of different securities in real time across hundreds of markets and thousands of over-the-counter pricing sources,” said McManus. “If you want to bring real-time data from a hundred different exchanges into your firm, rather than going to connect a hundred times to each of those different sources, you can go and connect once to Bloomberg and we’ll give you all of that real-time data.”
There are three latency categories for market data feeds, according to McManus. The first two are ultra low latency, where the data source is co-located next to the exchange, and low latency, where data is transferred directly from an exchange to a customer site.
The third category is data that is collected from all these sources in real time and aggregated by Bloomberg, Interactive Data, Thomson Reuters, etc. and redistributed back to clients. “Because we are bringing it all in and redistributing it, it is not quite as low latency as if you were gathering it yourself right next to the exchange,” said McManus. “We don’t really service the high-frequency trading type.”