Savvis Launches Big Data Service
Big Data is hitting the financial markets in a Big Way, as firms struggle to keep up with reams of structured and unstructured data.
While Big Data presents unique challenges for storage and IT infrastructure, it also presents unique opportunities, notably the prospect of mining data for correlations that can be used to generate alpha.
“In capital markets, firms are trying to find actionable intelligence,” said Roji Oommen, managing director of financial services at Savvis. “Can I generate alpha with information?”
Big Data solutions have emerged in the form of Apache Hadoop, an open-source framework for processing vast amounts of data (multi-terabyte datasets) in parallel on large clusters (thousands of nodes) of commodity hardware, along with commercial offshoots such as MapR and Cloudera.
“Hadoop is the core open source technology upon which all the various distributions are based,” said Milan Vaclavik, senior director for Big Data, global solutions management at Savvis. “There are commercial distributions such as MapR and Cloudera that take the open source stack and have added their own intellectual property, while at the same time making sure they are contributing back to the open source community.”
Hadoop pairs a distributed file system with a parallel processing framework, both built to run on commodity hardware and storage. “It allows you to link together commodity hardware and create the equivalent of grid computing, in the process generating massive processing capabilities,” said Vaclavik.
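The grid-computing effect Vaclavik describes comes from Hadoop's MapReduce model: a map step runs over data chunks on many nodes at once, a shuffle step groups intermediate results by key, and a reduce step combines each group. As a rough illustration only, the classic word-count pattern can be sketched on a single machine (real Hadoop jobs are written against the Hadoop APIs and run across a cluster):

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs from each input record;
    # on a cluster, this runs in parallel on each data chunk
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key (Hadoop does this across nodes)
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: collapse each key's values into a final result
    return {key: sum(values) for key, values in groups.items()}

records = ["to be or not to be"]
counts = reduce_phase(shuffle(map_phase(records)))
# counts maps each word to its frequency, e.g. counts["to"] == 2
```

Because map tasks are independent, adding commodity nodes scales throughput almost linearly, which is the economic appeal Savvis is packaging as a managed service.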
Savvis has launched Big Data Solutions, a suite of managed hardware and software services for optimizing data storage, integration, retrieval and analysis.
The first set of services is called Savvis Big Data Foundation Services – a suite of standardized, fully hosted and managed services designed to help organizations glean the most value possible from all their data. The suite includes Savvis’ managed services for Cloudera and MapR platforms based on Hadoop.
“Big Data Foundation Services is a fully managed Hadoop distribution, using either MapR or Cloudera. We manage the Hadoop infrastructure and the Hadoop environment,” said Vaclavik. “We are running it as an Infrastructure-as-a-Service [IaaS] by leveraging the networking capabilities of Savvis and CenturyLink [Savvis’ parent company]. As part of Phase II we will be offering analytics services as well, layering that on top of the Foundation service.”
The rationale for managed services in Big Data for capital markets rests on cost and efficiency: managing huge data sets is no longer a core competency for even the largest firms.
“Technology infrastructure is no longer a differentiator,” said Oommen. “Market has matured to the point where investing time at the infrastructure layer is no longer cost-effective; analytics and data are what’s important, and implementing Hadoop is difficult at the individual firm level.”
That’s where hosting services come in.
“At Weehawken, N.J., Savvis hosts 20% of the U.S. equities market,” said Oommen. “We have storage array after storage array. Capital markets have a huge data storage problem, and want to align with Hadoop, which is solving the problem on a massive scale. Finding the two or three correlations that matter for generating alpha requires enormous computing power.”
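Why does finding "the two or three correlations that matter" take enormous computing power? A naive search compares every pair of data series, so the cost grows quadratically with the number of series. A minimal single-machine sketch of such a brute-force correlation scan (the series names and data here are hypothetical, purely for illustration):

```python
import itertools
import statistics

def pearson(x, y):
    # Pearson correlation of two equal-length series
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def top_correlations(series, k=3):
    # Brute-force scan over all pairs of series; the number of
    # pairs grows quadratically, which is why firms turn to
    # Hadoop-style clusters for market-scale datasets
    scored = [
        (abs(pearson(a, b)), name_a, name_b)
        for (name_a, a), (name_b, b) in itertools.combinations(series.items(), 2)
    ]
    return sorted(scored, reverse=True)[:k]

# Hypothetical toy data: three short price series
series = {"a": [1, 2, 3, 4], "b": [2, 4, 6, 8], "c": [1, 3, 2, 4]}
top = top_correlations(series, k=1)
```

With thousands of instruments and tick-level history, the pairwise work is exactly the kind of embarrassingly parallel job that a Hadoop cluster distributes across nodes.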