By Terry Flanagan

Big Data Looms Large in ‘14

The ability to process Big Data is emerging as one of the top imperatives for 2014 for financial services companies. With data crossing geographical, organizational, and asset class boundaries, global financial enterprises will need to blend analytics with networking, so that number crunching can be done locally, where the data resides, instead of hauling the data to a central location.

“Trading firms require as much data as possible in order to find opportunities and to mitigate risk. In a flat market, asset managers are looking more to overseas market opportunities, sometimes very far away, making the cost of transmitting market data back to the U.S. or Europe prohibitive for research and decision making,” said Dan Watkins, director of products and marketing at network provider Perseus Telecom.

Dan Watkins, Perseus


“Working with in-database or in-memory analytics solutions deployed across the network and co-located at market centers allows connected customers to find intelligence and to conduct research at the edge, eliminating the need to pull Big Data files back into a centralized processing system,” Watkins added.

Smart distributed systems give firms the flexibility of a subscription model: they can precisely select certain signals, or responses to queries, and send just those results back home rather than large, bandwidth-intensive data sets.
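A minimal sketch of the subscription model described above: a hypothetical edge node, co-located at a market center, scans raw ticks locally and ships home only the signals a subscriber asked for. The `Tick` structure, symbols, and volume thresholds here are illustrative assumptions, not anything specific to Perseus Telecom's network.

```python
# Hypothetical edge-filtering sketch: only subscribed, significant
# ticks cross the wide-area network; everything else stays local.

from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float
    volume: int

def edge_filter(ticks, subscriptions):
    """Keep only ticks for subscribed symbols whose volume meets
    the subscriber's threshold; the rest are discarded at the edge."""
    selected = []
    for t in ticks:
        threshold = subscriptions.get(t.symbol)
        if threshold is not None and t.volume >= threshold:
            selected.append(t)
    return selected

raw = [Tick("ABC", 10.5, 900), Tick("XYZ", 3.2, 50), Tick("ABC", 10.6, 40)]
subs = {"ABC": 100}            # subscriber only wants heavy ABC prints
home = edge_filter(raw, subs)  # one tick is transmitted, not three
```

The point of the design is that the filtering logic runs where the data resides, so the wide-area link carries only the subscriber's answer, not the raw feed.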

“Big Data is very real, especially for financial markets,” Watkins said. “However, there is an array of solutions in place that tackle these issues by taking bite-sized portions more intelligently, saving time, lowering overall latency, and combing out intrinsic value while discarding noise and saving on costs.”

When IBM researchers asked respondents with active Big Data projects to assess the state of their infrastructures, only slightly more than half of banking and financial markets companies reported having integrated information, although 87 percent said they have the infrastructure required to manage this growing volume of data.

“The promise of achieving significant, measurable business value from big data can only be realized if organizations put into place an information foundation that supports the rapidly growing volume, variety and velocity of data,” said Bob Palmer, head of global banking industry marketing, Big Data at IBM.

“The inability to connect data across organizational and department silos has been a business intelligence challenge for years, especially in banks where mergers and acquisitions have created countless and costly silos of data,” said Palmer. “This integration is even more important, yet much more complex, with big data.”

Integrating a variety of data types and analyzing streaming data often require new infrastructure components, like Hadoop, NoSQL, analytic appliances, real-time streaming and visualization. However, it is in these very technologies that banking and financial markets organizations are lagging behind their peers in other industries the most, according to IBM.

In other key big data infrastructure components, such as high-capacity warehouses, columnar databases, Hadoop implementations and stream computing, banking and financial markets companies are mostly on par with their cross-industry peers.

Globalization is driving innovation in Big Data.

“Sifting through historical data, recording patterns, and running regression algorithms across large files means tens or hundreds of terabytes of historical data per market,” Watkins said. “Moving that data over a wide area network is very expensive and a deterrent in global trading, where it’s all about discovery and then participation.”
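A hypothetical illustration of Watkins' point about regression over historical data: fit the model where the data lives, then move only the fitted coefficients across the network. The toy ordinary-least-squares routine below is an assumption for the sketch; in practice the regression would run inside an in-database or in-memory analytics engine at the market center.

```python
# Sketch: a regression computed at the edge reduces a large local
# history to two floats on the wire, instead of terabytes of ticks.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, computed locally."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var          # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

# Toy data standing in for a large local history: exactly y = 1 + 2x.
xs = list(range(5))
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
a, b = fit_line(xs, ys)    # only (a, b) need to be transmitted home
```

However large the local sample grows, the payload sent back to the home office stays the same size: the model, not the data.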

Perseus Telecom supports pushing applications at high speeds and controlled costs across its network so customers can create market value. “Only the requested files of value need to be transmitted, whether from South Africa to London, Singapore to Chicago, Chile to New York, and so on,” Watkins said.
