How Real is Real-Time Analytics?

10.25.2016
Market participants across the institutional buy side and sell side are pulling forward their analysis of trading, risk and compliance, to appease ever-more-demanding regulators and clients and to keep up with the competition.
In other words, first thing in the morning is no longer good enough as an analytics delivery timeframe.
“Firms are increasingly having to use real-time analytics for a range of different purposes,” said Rodney Taylor, Market Development Manager, Europe at technology and data provider Thomson Reuters. “That has changed the way they consume data, the way they analyze data, and the way they disseminate data, throughout the organization.”
For the end-user institutional trader, data is massive and multidimensional, spanning areas such as liquidity provider, trade venue, execution strategy, trade size, and time zone.
“Cleaning, managing and analyzing data is a significant challenge,” said David Biser, senior FX trader at Campbell & Co., a systematic, quantitative investment firm with $4.3 billion under management. “The difficulty is exploring this data at its maximum granularity to achieve the best assessment of implicit costs.”
Trading desks leverage data to achieve a number of specific objectives, such as determining implicit and explicit execution costs, deciding how to size and time individual trades, and evaluating algorithms and counterparties. “Conducting this statistical analysis allows traders to make more informed decisions,” Biser said. “From a best-execution standpoint, it also allows one to inform the decision-making process of the firm, and then use the data to justify trading decisions.”
Analytics turns data into actionable insights. Real-time analytics does the same thing, but on the spot rather than minutes, hours or days later.
Raising the bar on ‘big data’ and analytics is big business, and it’s rapidly getting bigger. IDC forecasts a roughly 50% increase in revenues from big data and business analytics software, hardware, and services between 2015 and 2019, from about $122 billion last year to $187 billion in three years’ time. The banking and manufacturing industries were poised to spend the most, according to the research firm.
Companies throughout the corporate landscape are seeking to exploit cutting-edge analytics to effect digital transformation, adapt to disruption, and gain a competitive edge. Drilling down from that commonality, each industry has its unique drivers as well. In the capital markets business, one primary driver of the demand for real-time analytics is regulation.
Pertinent rule sets include Markets in Financial Instruments Directive (MiFID) II in Europe, which is scheduled to take effect in January 2018, and Dodd-Frank in the U.S., which was signed into law in 2010 but remains a work in progress.
Regulation does not explicitly call for market participants to install or upgrade to real-time analytics. But there are a number of regulatory thrusts that make, or will make, real-time analytics a virtual necessity, at least for certain asset classes in certain regions.
For example, a broad aim of MiFID II is to boost transparency in securities trading, especially in over-the-counter (OTC) markets in which swaps and other derivatives change hands. Drilling down into the rule set, MiFID calls for trading venues, investment firms, and systematic internalizers to report pre-trade bid and offer prices on a continuous basis. On a post-trade basis, trading venues and investment firms will need to report price, volume, and time of trade at as near to real-time as possible.
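To make the post-trade requirement concrete, the fields MiFID II singles out — price, volume, and time of trade — can be modeled as a simple record. This is an illustrative sketch only: the class and field names are assumptions, not the regulatory reporting schema.

```python
# Hypothetical shape of a near-real-time post-trade report carrying the
# three fields the article names: price, volume, and time of trade.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PostTradeReport:
    instrument_id: str      # placeholder identifier, e.g. an ISIN in practice
    price: float            # traded price
    volume: int             # traded quantity
    executed_at: datetime   # UTC timestamp, as close to trade time as possible

report = PostTradeReport(
    instrument_id="EXAMPLE-ISIN",
    price=101.25,
    volume=1_000_000,
    executed_at=datetime(2016, 10, 25, 14, 30, 5, tzinfo=timezone.utc),
)
print(asdict(report))  # a flat dict, ready to serialize and publish
```

The point of the sketch is only that the mandated fields are few and well defined; the hard part, as the article notes, is publishing them at near real-time scale.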
The complexity, detail and interpretations of the voluminous regulations are still being sorted through, but one assertion is broadly accepted: legacy analytics systems aren’t capable of clearing the higher bar.
Adding to the challenge is that certain regulatory provisions, such as those around encryption and customer privacy, may force market firms to make fundamental design changes to their data systems. “How you architect your analytics and still meet regulatory requirements is key,” said Taylor of Thomson Reuters. “Some large organizations think they’re going to have to completely re-architect the way they store customer data.”
A primary subset of analytics in capital markets is transaction cost analysis (TCA), which measures the efficiency of trades. TCA was historically disseminated via next-day reports highlighting price slippage, plus quarterly meetings to assess longer-term trading-strategy performance; today’s state-of-the-art TCA enables a trader to optimize in-progress executions, rather than just providing data on trades that already happened.
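One of the simplest TCA measures the article alludes to is slippage: how far a trade’s fills drifted from the price at order arrival. A minimal sketch, assuming illustrative field names and sample fills (not any vendor’s methodology):

```python
# Volume-weighted slippage versus arrival price, in basis points.
# For a buy, paying above the arrival price is adverse (positive) slippage;
# the sign flips for a sell. Sample numbers below are invented.

def slippage_bps(arrival_price: float,
                 fills: list[tuple[float, int]],
                 side: str = "buy") -> float:
    """fills: list of (price, quantity) for each execution of the order."""
    total_qty = sum(qty for _, qty in fills)
    vwap = sum(price * qty for price, qty in fills) / total_qty
    sign = 1 if side == "buy" else -1
    return sign * (vwap - arrival_price) / arrival_price * 10_000

# A buy order that arrived with the stock at 50.00 and filled slightly higher:
fills = [(50.02, 300), (50.05, 200)]
print(round(slippage_bps(50.00, fills), 2))  # → 6.4 bps of adverse slippage
```

Next-day TCA reports aggregate numbers like this after the fact; the real-time variant the article describes computes them fill by fill, while the parent order is still working, so the trader can adjust the execution strategy mid-flight.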
That’s in the equity market, which has led on TCA since the inception of electronic trading decades ago. In OTC and other markets that have less standardization, shallower liquidity and limited historical data compared with listed stocks, TCA is in its early stages. But with regulators steadfast that best-execution standards be applied more broadly across markets, its development is being fast-tracked.
Whereas real-time analytics was once a curiosity that begat the question of ‘what do I do with this?’, market participants are increasingly seeing the value. “Asset managers are recognizing that in order to do their jobs better — whether that’s in terms of monitoring costs, looking for trading opportunities, or having better-informed overall trading procedures — they need to have access to real-time analytics,” said Spencer Mindlin, an analyst at Aite Group.
“There’s a decreasing tolerance for batch-process or overnight analysis of exposure,” Mindlin said. “There’s just an insatiable desire to know your market and counterparty exposures as close as you can to real-time. That is being driven by regulation.”
Regarding the hosting of Analytics as a Service (AaaS), observers note that the trend is moving toward the cloud — slowly.
“With real-time analytics, we still mostly see people capturing data in their own data centers, typically on one machine, because they can add all the data together in one space,” said Fintan Quill, global head of sales engineering at in-memory streaming analytics database provider Kx Systems. “It’s just faster. If you’re doing very latency-sensitive processing or analytics, you get results straight away whereas if it’s distributed you have the network latency.”
Cloud offers the advantages of IT cost savings and reduced capital expenditures, and security concerns have eased a bit. For some market participants, a hybrid model with the most proprietary information staying on the ground makes the most sense. “People might have their market data servers in the cloud, but they’re probably a little reluctant to put their own order and execution data in the cloud as well,” Quill said.
The emergence of real-time analytics holds different implications for different market constituencies. Institutional investment managers want real-time analytics, but they may not have the expertise or budget to support it. ‘Bulge bracket’ broker-dealers have the resources to develop real-time analytics, but they aren’t the technology leaders they once were, and they have multiple other business lines to tend to.
That leaves an opening for managed-service providers, who focus on technology and can provide a top-shelf offering at a reasonable cost.
“When you look at something like pre-trade analytics, firms are looking for vendors to provide the same quality of service that they have in-house, especially around latency,” said Tom Kennedy, global head of analytics at Thomson Reuters. “Getting a feed for that and managing the analytics is critical within their workflow.”
Business conditions are also driving demand for real-time analytics. In capital markets, that starts at the top of the food chain with alpha-challenged and cost-constrained institutional investors and asset owners, who are asking more from their brokers and technology providers.
“As clients continue to emphasize the need for best execution at the lowest cost, this demands ongoing investment in information systems and capabilities to parse and analyze the underlying data,” said Matthew Rowley, chief technology officer at agency broker WallachBeth Capital.
Such capabilities can provide a competitive advantage by enabling traders to trade better. “As firms take a more data-centric approach, traders will be supported by evidence-based decision-support systems,” Rowley said. “This is the crux of the ‘real-time analytics’ slogan — actionable analytics. Numbers are meaningless unless they have a decision attached to them.”
As for the future, Rowley said “brokers’ prospects are tied to the strength of their technology and vision, and their proprietary modelling ability to take advantage of the massive amounts of real-time data, both public and internal.”
Biser, the Campbell & Co. trader, stated: “As technology and analytics improve, so too will the feedback loop of execution decisions. Trading is a process more than it is one single outcome.”