Big Data Holds Big Implications
‘Big data’ is challenging banks to capture and analyze the information produced as a byproduct of financial transactions.
In late July, a survey from TheInfoPro revealed that the majority of data storage professionals have no plans to implement big data analytics, with 56% of respondents indicating that they won’t deploy big data solutions even after 2013.
“Big data is a cost within itself, as identifying correlations in data is very machine intensive,” said Neil McGovern, senior director of marketing at Sybase, a technology provider. “Large upfront costs in building data sets and analysis engines are a limiting factor for many companies. However, in capital markets the benefits are far greater than the cost, so companies can afford to invest in it.”
The financial services sector is typically expert at churning through vast amounts of data, gathered over decades of customer interactions and transactions. Big data is the term used to describe data sets that have grown so large they become awkward to analyze.
“Some financial institutions might be said to have perfected the art of data analytics, but there is always more to learn about how big data can be harnessed by financial institutions,” said Sriram Anandan, managing consultant at iCreate Software, an analytics provider.
A recent survey on data management in banks and investment services firms by Gartner underscores the vast potential of big data analytics within the financial services sector. Only a third of the financial sector rated data quality (for supporting business intelligence and management decision making) as a high priority, suggesting considerable room for IT investment.
In addition, there is an interesting shift in the way regulators are monitoring the operations of financial services firms and insisting upon transparency.
For instance, Anandan noted, India’s central bank, the Reserve Bank of India (RBI) has directed all banks to standardize their regulatory reporting by following an automated data flow approach to ensure 100% accuracy and zero human intervention in every stage of reporting, from data extraction from source systems through to the actual submission of reports to the RBI’s Central Data Repository.
“Firms that cannot utilize complete information and firms that believed reporting didn’t really require management attention, are now warming up to the new big data reality,” Anandan said.
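The automated data flow approach described above, from source-system extraction through to submission, can be pictured as a pipeline in which every stage either passes a validated batch forward or rejects it outright, with no manual patching in between. The sketch below is purely illustrative; the stage names, field names, and validation rule are assumptions, not the RBI's actual schema.

```python
# Minimal sketch of an automated data flow pipeline: extraction, validation,
# report building, and submission run end to end with no human intervention.
# All names and fields here are hypothetical, for illustration only.

def extract(source_rows):
    """Pull raw records from a source system (stubbed as a list of dicts)."""
    return [dict(row) for row in source_rows]

def validate(records):
    """Reject the whole batch if any record is incomplete; no manual fixes."""
    required = {"account_id", "balance", "as_of_date"}
    bad = [r for r in records if not required <= r.keys()]
    if bad:
        raise ValueError(f"{len(bad)} record(s) failed validation; batch rejected")
    return records

def build_report(records):
    """Aggregate validated records into the figures the report requires."""
    return {"record_count": len(records),
            "total_balance": sum(r["balance"] for r in records)}

def submit(report):
    """Hand the report to the central repository (stubbed: returns it)."""
    return report

source = [{"account_id": 1, "balance": 100.0, "as_of_date": "2012-08-01"},
          {"account_id": 2, "balance": 250.0, "as_of_date": "2012-08-01"}]
result = submit(build_report(validate(extract(source))))
print(result)  # {'record_count': 2, 'total_balance': 350.0}
```

Because each stage only consumes the previous stage's output, a bad record stops the run with an error rather than flowing silently into the submitted report, which is the intent behind the zero-human-intervention requirement.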
Data scientists across markets are becoming more prevalent. “They’re the next generation of actuaries, using the same techniques to find correlations within data sets,” said McGovern at Sybase. “In capital markets, financial professionals have developed an understanding of the statistics and analysis process, and are therefore more apt to tackle big data.”
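The correlation-hunting McGovern describes can be sketched in a few lines: compute pairwise correlations across a set of return series and flag the pairs above a threshold. The data here is synthetic and the threshold is an arbitrary assumption, purely to illustrate the technique.

```python
import numpy as np

# Hypothetical daily returns for three instruments (rows = days, cols = instruments).
# In practice these series would come from a market data store.
rng = np.random.default_rng(seed=42)
returns = rng.normal(0, 0.01, size=(250, 3))
# Make the third series partially track the first, so a real correlation exists.
returns[:, 2] = 0.7 * returns[:, 0] + 0.3 * returns[:, 2]

# Pairwise Pearson correlations across instruments (columns).
corr = np.corrcoef(returns, rowvar=False)

# Flag instrument pairs whose correlation magnitude exceeds a chosen threshold.
threshold = 0.5
pairs = [(i, j) for i in range(corr.shape[0])
         for j in range(i + 1, corr.shape[1])
         if abs(corr[i, j]) > threshold]
print(pairs)  # flags the deliberately correlated pair (0, 2)
```

At production scale the same computation runs over thousands of series and years of history, which is where the machine-intensive cost McGovern mentions comes from.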
Financial services firms have a new regulation-driven incentive to apply insightful data analytics.
“Regulators are relentlessly demanding more transparency and better risk management, even suggesting that if using better technology is the way forward, then so be it,” Anandan at iCreate said. “New regulatory demands, from the RBI’s automated data flow mandate to the Basel II and III capital adequacy rules and IFRS, are all ‘front of mind’.”