10.05.2012
By Terry Flanagan

The Big Deal About Big Data

‘Big data’ technologies that were originally developed by Google, Amazon and other web-based companies are being leveraged by financial markets firms to store, retrieve and analyze massive amounts of data.

Big data refers to the large and quickly growing volumes of data from various sources (e.g., financial exchange data, retail point of sale data, sensor data, social media data, utility meter data and geospatial data).

“For capital markets, big data is a ‘big deal’ since regulators—Finra [Financial Industry Regulatory Authority], the SEC [Securities and Exchange Commission] and others—and financial research institutions need quick and seamless ways of receiving and interpreting large volumes of data,” said Matt Benati, vice-president of global marketing at Attunity, a technology provider.

For instance, Nasdaq is utilizing Amazon Web Services (AWS) to give U.S. brokers a new way to store data and records that regulators require them to maintain, Benati noted.

New regulations are affecting the buy side’s requirements for data collection, storage and management.

“Bigger data needs better tools to analyze what you’ve got,” said Jay Hinton, global product manager at Mantara, a technology provider. “In turn, better tools require better data and better data requires more storage and compute power.

“Ultimately, all of this will end up in a cloud, which seems like a cheaper option, with the ability to scale. It will be interesting to see if it plays out in reality.”

Attunity, a provider of information availability software, has enhanced CloudBeam, its recently introduced software-as-a-service platform for AWS, to provide a new data replication-as-a-service offering for AWS’ Simple Storage Service (S3).

The service provides replication and synchronization of big data stored in S3 across AWS cloud regions to enable business-critical initiatives, including disaster recovery, backup and data distribution.

“Attunity’s latest deal with AWS could lead to Attunity’s involvement in the easy transfer of this stored data to the necessary mediums for analytics,” Benati said.

Big data analytics technologies encompass data warehouses, Hadoop and stream processing systems. Each serves a different need, and combined they help enterprises cover the spectrum of big data analytics.

“The goal of these technologies is to provide fact-based answers to a company’s questions—no longer do companies rely on ‘gut checks,’” Benati said.

Data warehouses enable companies to run predictive analytics over massive amounts of structured information, often measured in petabytes.

Hadoop is focused on deriving value from unstructured historical information—the type of data cited as having the highest growth rate, driven largely by the vast array of social networks.
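Hadoop's core programming model, MapReduce, splits such work into a map phase that turns raw records into key-value pairs and a reduce phase that aggregates them by key. As a rough illustration only (plain Python standing in for a Hadoop cluster), the canonical word-count job over unstructured text, such as social media posts, might be sketched as:

```python
from collections import defaultdict

# Map phase: each "mapper" turns a chunk of unstructured text
# into (word, 1) key-value pairs.
def map_phase(text):
    for word in text.lower().split():
        yield (word.strip(".,!?"), 1)

# Shuffle/reduce phase: group the pairs by key and sum the counts,
# mirroring what Hadoop does in parallel across many machines.
def reduce_phase(pairs):
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

posts = ["Big data is a big deal", "Big deal for capital markets"]
pairs = [pair for post in posts for pair in map_phase(post)]
print(reduce_phase(pairs)["big"])  # prints 3
```

Because neither phase needs to see the whole data set at once, the same pattern scales from two sentences to petabytes of historical records.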

“These two technologies look holistically at these very large and highly varied data stores to generate targeted answers,” said Benati.

Stream-based technologies, on the other hand, work directly off real-time data feeds and therefore rely on smaller windows of time to make decisions. In these stream-based scenarios, real-time data is essentially compared to historical data to spot similarities.
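That windowed comparison can be illustrated with a short, hypothetical Python sketch (a real deployment would use a stream processing engine, and the baseline, window size and threshold below are invented for the example): keep a small sliding window over a live feed and flag ticks whose window average deviates sharply from a historical baseline.

```python
from collections import deque

HISTORICAL_MEAN = 100.0   # baseline derived offline from historical data
WINDOW_SIZE = 5           # the "smaller window of time"
THRESHOLD = 0.05          # flag a deviation of more than 5%

def detect_anomalies(feed):
    window = deque(maxlen=WINDOW_SIZE)  # only recent ticks are kept
    alerts = []
    for tick in feed:
        window.append(tick)
        window_mean = sum(window) / len(window)
        # Compare the real-time window against the historical baseline.
        if abs(window_mean - HISTORICAL_MEAN) / HISTORICAL_MEAN > THRESHOLD:
            alerts.append((tick, window_mean))
    return alerts

feed = [100.2, 99.8, 100.1, 112.0, 113.5]
print(detect_anomalies(feed))  # flags the jump at the end of the feed
```

The key design point is that the stream processor never stores the full history: decisions are made from the bounded window plus a precomputed summary of the past.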
