Big Bucks in Big Data
Exchanges and other capital markets segments are discovering that underneath the challenges associated with Big Data lies a silver—and even golden—lining, in the form of new sources of revenue.
“One of the most interesting trends we are observing is that of exchanges as data producers,” said Michael Chow, director of global marketing and corporate development at GoldenSource, a provider of enterprise data management technology for financial firms.
“Exchanges are realizing that they have a compelling proposition to provide market and reference data products to downstream users, such as asset managers and broker-dealers,” Chow said.
After identifying enterprise data management as the linchpin of its strategy to sustain growth in product development, listing and trading instruments, Australia’s ASX Group tapped GoldenSource EDM to enable further innovations in this area.
“ASX had enjoyed a monopoly until recently, when Chi-X moved in,” said Chow. “As a result, ASX saw the need to upgrade its 20-year-old data management platform in order to lower costs, as well as monetize its data by offering reference data downstream.”
For operators of trade execution services, data is not merely a byproduct, but an essential deliverable.
“We’ve decided to build our solutions around data,” said Neal Goldstein, chief information officer at Liquidnet, a buy-side-focused block trading venue. “We offer people access to liquidity they wouldn’t see elsewhere. Things like business intelligence and TCA are really important to us internally, but also to members.”
Liquidnet foresaw the explosion of data, and has ramped up its data management accordingly.
“It’s been something we’ve been working on for the past couple of years,” Goldstein said. “We haven’t outsourced the data warehousing or data mining part of our technology stack. We felt like data was one of our core competencies. We’ve spent a decent amount of energy, and intellectual and physical resources to get that just right.”
With a centralized, trusted copy of company, securities and trading data, ASX will be able to flexibly process complex product sets and corporate action events, reduce time to market for launching new products and ensure consistency of timely data delivered across its exchange activities.
“GoldenSource is the central repository that allows ASX to store and retrieve corporate actions data from multiple sources, and to create new data products,” Chow at GoldenSource said.
A prerequisite for harnessing the power of Big Data is collecting it into one place in a rationalized, readily operable format.
“Data lives in dozens of disparate systems across the firm, and the difficulty of connecting those should not be underestimated,” said Joshua Walsky, chief technology officer at Broadway Technology, a provider of financial trading solutions for electronic fixed income markets.
“Then, once you’ve got it in the same place, you’ve got to make it fast enough to meet the demands of the market,” said Walsky. “Those two issues, although easy to explain and put on a PowerPoint, are tremendously difficult to actually accomplish.”
The nature of new regulations mandates greater system integration—integrating execution data with counterparty data, clearing data, and position and risk data, and so on. “Technology that provides for this integration and brings systems together will yield the greatest benefit,” said Walsky.
Propagating the ‘golden’ data source to downstream systems and data users demands both ‘push’ channels and interfaces for systems that need to ‘pull’ information on demand.
“All information disseminated must be available in any one of a multitude of formats, all interfaced through a variety of protocols,” said Alberto Corvo, managing principal at eClerx, a financial services outsourcer.
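As a rough illustration of the push/pull pattern Corvo describes, the sketch below shows a distributor that holds a golden copy, pushes updates to subscribed systems, and serves on-demand pull requests in more than one format. The class name, identifiers and record fields are invented for illustration, not taken from any vendor’s actual API.

```python
# Sketch of a reference-data distributor supporting both 'push' subscriptions
# and on-demand 'pull' requests in multiple output formats.
# All names and record layouts here are hypothetical.
import csv
import io
import json

class GoldenCopyDistributor:
    def __init__(self):
        self._records = {}      # security id -> record (the 'golden copy')
        self._subscribers = []  # callbacks for push delivery

    def subscribe(self, callback):
        """Register a downstream system for pushed updates."""
        self._subscribers.append(callback)

    def publish(self, sec_id, record):
        """Update the golden copy and push the record to all subscribers."""
        self._records[sec_id] = record
        for cb in self._subscribers:
            cb(sec_id, record)

    def pull(self, sec_id, fmt="json"):
        """Serve an on-demand request in the caller's preferred format."""
        record = self._records[sec_id]
        if fmt == "json":
            return json.dumps(record)
        if fmt == "csv":
            buf = io.StringIO()
            writer = csv.DictWriter(buf, fieldnames=sorted(record))
            writer.writeheader()
            writer.writerow(record)
            return buf.getvalue()
        raise ValueError(f"unsupported format: {fmt}")

dist = GoldenCopyDistributor()
received = []
dist.subscribe(lambda sid, rec: received.append(sid))   # a push consumer
dist.publish("XS123", {"isin": "XS123", "coupon": 4.5})
print(dist.pull("XS123", fmt="json"))  # {"isin": "XS123", "coupon": 4.5}
```

The point of the sketch is only that the same golden record backs both delivery styles, so push and pull consumers can never drift out of sync.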
Reference data management, for example, spans multiple subsets, each with its own requirements: product data, client data, pricing data and taxonomy data.
“Taxonomy data is critical to identifying counterparty exposure,” said Corvo. “Firms need to manage exposures with entities and sub-entities, and if you don’t have the taxonomy right, it creates a number of problems.”
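To see why getting the taxonomy right matters for exposure, consider a minimal sketch of rolling exposures up a legal-entity hierarchy; the entity names and amounts below are invented. If a sub-entity were mapped to the wrong parent, the group-level total would be silently wrong.

```python
# Sketch: aggregating counterparty exposure up a legal-entity taxonomy.
# Entity names and amounts are invented for illustration.

# taxonomy: sub-entity -> parent entity (None marks a top-level entity)
PARENTS = {
    "BankCo London": "BankCo Group",
    "BankCo NY": "BankCo Group",
    "BankCo Group": None,
}

# trade-level exposures keyed by the entity actually faced
EXPOSURES = {
    "BankCo London": 40.0,
    "BankCo NY": 25.0,
}

def root_of(entity: str) -> str:
    """Walk the taxonomy up to the top-level legal entity."""
    while PARENTS.get(entity) is not None:
        entity = PARENTS[entity]
    return entity

def total_exposure(top_entity: str) -> float:
    """Sum exposure across every sub-entity that rolls up to top_entity."""
    return sum(v for e, v in EXPOSURES.items() if root_of(e) == top_entity)

print(total_exposure("BankCo Group"))  # 65.0
```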
Defining process workflows with a strict methodological approach, and then delivering on them, is a specialized task, well suited to established process-oriented service providers.
“Running rules-based processes with highly skilled, and in many cases lower cost, resources is the main business of outsourced service providers,” said Corvo, noting their experience in building extensive data architectures, robust and secure infrastructure, automation tools and process designs.
The need to identify and quantify complex interrelationships, while not unique to capital markets, is of a scale that can’t be addressed with standard-issue enterprise data management approaches. “In this post-securities master world, the challenge is to not only create a ‘golden copy’ of instrument data but also integrate related datasets, including positions, transactions, customers and counterparties,” said Chow at GoldenSource. “The ultimate goal is to unlock the value at the intersection of these domains; this is exactly the kind of problem Big Data can solve.”
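A toy example of the cross-domain queries Chow describes: once instrument, position and counterparty data share identifiers in one place, a question that spans all three domains becomes a simple join. The datasets and field names below are invented for illustration.

```python
# Sketch: joining instrument, position and counterparty 'domains' through
# shared identifiers to answer a cross-domain question. Data is invented.

instruments = {"XS123": {"name": "ACME 4.5% 2030", "asset_class": "bond"}}
positions = [
    {"sec_id": "XS123", "counterparty": "BankCo", "quantity": 1_000_000},
]
counterparties = {"BankCo": {"rating": "A-"}}

def positions_by_rating(rating):
    """List positions whose counterparty carries the given rating --
    a question that touches all three datasets at once."""
    return [
        {**p, "instrument": instruments[p["sec_id"]]["name"]}
        for p in positions
        if counterparties[p["counterparty"]]["rating"] == rating
    ]

print(positions_by_rating("A-"))
```

With the datasets siloed, the same question would require reconciling identifiers across three systems before the join could even start.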
The presence of silos, or buckets of asset classes, complicates the task of managing the ever-growing streams of real-time data flowing through markets.
“These silos tend to develop not organically, but rather around specific business initiatives, such as an investment firm deciding to launch a new trading division,” said Jeff Wells, vice president of product marketing at Exegy, a market data appliance company.
Exegy provides a “ticker plant”—hardware and software that provides access to over 100 feed handlers for exchanges around the world.
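The core job of a feed handler can be sketched in a few lines: each venue publishes raw messages in its own layout, and the handler normalizes them into one common tick format that the rest of the plant can fan out. The message formats below are invented, not Exegy's actual wire protocols.

```python
# Sketch of feed-handler normalization: venue-specific raw messages become
# one uniform tick format. The message layouts here are hypothetical.

def parse_venue_a(raw: str) -> dict:
    # Venue A publishes "SYMBOL|PRICE|SIZE"
    sym, px, sz = raw.split("|")
    return {"symbol": sym, "price": float(px), "size": int(sz)}

def parse_venue_b(raw: str) -> dict:
    # Venue B publishes "SYMBOL,SIZE,PRICE" (note the different field order)
    sym, sz, px = raw.split(",")
    return {"symbol": sym, "price": float(px), "size": int(sz)}

HANDLERS = {"A": parse_venue_a, "B": parse_venue_b}

def normalize(venue: str, raw: str) -> dict:
    """Route a raw message to its venue's handler; the output is uniform."""
    return HANDLERS[venue](raw)

print(normalize("A", "BHP|45.10|200"))
print(normalize("B", "BHP,500,45.12"))
```

Downstream consumers then see a single schema regardless of which of the hundred-plus venues a tick originated from.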