A Mountain of Data
Big Data enters the capital markets lexicon.
The capital markets industry is dealing with significant challenges owing to exponential growth in the volume and velocity of data.
This manifests in the need to capture, analyze, and transmit very large amounts of data in a very short span of time.
In 2012, the world will create about 1.5 zettabytes of information, most of which will be unstructured data.
One zettabyte is 10 to the 21st power bytes. It has been estimated that all of the data on the internet as of 2009 added up to half a zettabyte.
“The exploding volume of data available from an exponentially growing range of sources is both a massive challenge and an unprecedented opportunity for radical and fundamental change in the industry,” said Phil Lynch, CEO of Asset Control.
Developing risk management, trading and quantitative analysis applications in the era of Big Data poses some monumental technological challenges.
These include loading massive volumes of time-series data from multiple sources in real time, analyzing high-speed streaming data with very low latency, and supporting rapid development and backtesting of quantitative models against years of historical data.
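To make the backtesting challenge concrete, the sketch below runs a toy moving-average crossover strategy over a synthetic price history. It is illustrative only: the strategy, window sizes, and price series are hypothetical, and a production backtest would run against years of real tick data rather than 500 simulated points.

```python
import random

random.seed(42)

# Synthetic daily price history (random walk) standing in for real data.
prices = [100.0]
for _ in range(499):
    prices.append(prices[-1] * (1 + random.gauss(0, 0.01)))

def sma(series, window, i):
    """Simple moving average of the `window` points ending at index i."""
    return sum(series[i - window + 1 : i + 1]) / window

# Toy rule: long when the 10-day SMA is above the 30-day SMA, flat otherwise.
position, cash, shares = 0, 10_000.0, 0.0
for i in range(30, len(prices)):
    signal = 1 if sma(prices, 10, i) > sma(prices, 30, i) else 0
    if signal and not position:        # enter the position
        shares, cash, position = cash / prices[i], 0.0, 1
    elif not signal and position:      # exit the position
        cash, shares, position = shares * prices[i], 0.0, 0

final_value = cash + shares * prices[-1]
print(f"final portfolio value: {final_value:,.2f}")
```

Even this toy loop hints at the scaling problem: each new model variant re-reads the full history, which is why firms invest in storage tiers optimized for fast sequential scans of time-series data.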
Asset managers are being challenged to monitor real-time value, exposure and sensitivities across a complex portfolio.
Paladyne’s modelling and valuation tools have been expanded to include additional model libraries, enhanced sensitivity analysis, and more seamless real-time access to market data (including curves, volatilities and spreads).
“This is aimed mainly at the front-office where real-time views of value and exposure are being demanded more and more,” said Sameer Shalaby, CEO of Paladyne. “It is no longer good enough for managers to revalue their derivatives once a week or once a day, as used to be the case.”
To perform the advanced analytics and calculations required to support electronic trading strategies, firms must implement platforms that can store greater quantities of data and quickly and accurately retrieve and process historical and time-series data.
Firms are attempting to synthesize traditional database technology for structured data with emerging Big Data technologies for unstructured data, such as Hadoop.
Massively parallel processors, which involve the coordinated processing of a program between multiple independent computers, each with its own operating system and memory, were cited by 31% of respondents as a potential solution, according to a survey of data professionals by A-Team Group.
In-memory databases, which store data in main memory rather than on disk, were cited by 17% of respondents, and NoSQL databases, which are non-relational data stores that don’t use Structured Query Language, were cited by 15% of respondents.
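As a rough illustration of the in-memory approach the survey respondents cited, the sketch below keeps tick data in process memory keyed by symbol, trading durability for retrieval speed. The class and its methods are hypothetical; a real in-memory database adds persistence, sharding, and concurrency control.

```python
from bisect import bisect_left, bisect_right
from collections import defaultdict

class InMemoryTickStore:
    """Toy in-memory time-series store: ticks held in RAM, keyed by symbol.
    Assumes ticks arrive in timestamp order; no persistence or replication."""

    def __init__(self):
        self._ticks = defaultdict(list)   # symbol -> [(timestamp, price), ...]

    def insert(self, symbol, timestamp, price):
        self._ticks[symbol].append((timestamp, price))

    def range_query(self, symbol, start, end):
        """Return ticks with start <= timestamp <= end via binary search."""
        rows = self._ticks[symbol]
        lo = bisect_left(rows, (start,))
        hi = bisect_right(rows, (end, float("inf")))
        return rows[lo:hi]

store = InMemoryTickStore()
for t, px in [(1, 100.0), (2, 100.5), (3, 99.8), (4, 101.2)]:
    store.insert("XYZ", t, px)
print(store.range_query("XYZ", 2, 3))   # -> [(2, 100.5), (3, 99.8)]
```

Because every lookup is a binary search over contiguous in-memory lists, range queries avoid disk seeks entirely, which is the core latency advantage in-memory systems offer over disk-backed relational stores.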
“Vector storage, rather than traditional relational databases, will be needed to understand complex trends and scenarios,” said Oliver Muhr, executive vice president of SunGard’s MarketMap business unit.
New regulations, such as the Dodd-Frank Act, the Markets in Financial Instruments Regulation (MiFIR), and the European Market Infrastructure Regulation (EMIR), which will cause the majority of OTC derivatives to be traded on exchanges, centrally cleared, and reported to trade repositories, will create new challenges as well as potential avenues for growth.
“Companies have, right now, the opportunity to discuss and identify the added business values that they want to get out of this Dodd-Frank compliance exercise,” said Ian Jones, senior strategist for commodity risk at SAS RiskAdvisory.
The regulations mandate a more sophisticated approach to the collection of data from multiple systems and the ability to aggregate that data, report it out, and then maintain it in a fully auditable way.
“This required level of data management opens up an opportunity for a more comprehensive – and more real-time – capacity to run all manner of analytics on this new, aggregated dataset,” said Jones.
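A minimal sketch of that collect-aggregate-audit pattern is shown below: trade records from two hypothetical source systems are tagged with their origin and a content hash on ingestion, then rolled up by asset class for reporting. All system names, field names, and figures are invented for illustration.

```python
import hashlib
import json
from collections import defaultdict

# Hypothetical trade records from two source systems.
system_a = [{"trade_id": "A1", "notional": 5_000_000, "asset": "IRS"},
            {"trade_id": "A2", "notional": 2_000_000, "asset": "CDS"}]
system_b = [{"trade_id": "B1", "notional": 3_000_000, "asset": "IRS"}]

audit_log = []

def ingest(records, source):
    """Tag each record with its source system and a content hash,
    so every reported figure can be traced back to its origin."""
    out = []
    for rec in records:
        entry = dict(rec, source=source)
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        audit_log.append({"source": source,
                          "trade_id": rec["trade_id"],
                          "sha256": digest})
        out.append(entry)
    return out

trades = ingest(system_a, "system_a") + ingest(system_b, "system_b")

# Aggregate notional exposure by asset class for the regulatory report.
exposure = defaultdict(int)
for t in trades:
    exposure[t["asset"]] += t["notional"]

print(dict(exposure))   # -> {'IRS': 8000000, 'CDS': 2000000}
```

Once the aggregated dataset exists with a per-record audit trail, the same pipeline can feed the broader analytics Jones describes, not just the compliance report that motivated it.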