Capital Markets Embrace Big Data
New data visualization software is enabling capital markets firms to embed analysis tools directly into their own applications, facilitating mobile analysis and reporting.
By embedding advanced data visualization into existing applications, firms can improve user acceptance and utilization by adding interactive, graphical front-ends to their systems.
“This will allow users to take action directly from within interactive dashboards that are integral to the host application, thereby reducing the amount of time required to act based on the results of visual data analysis,” said Willem De Geer, managing director at Panopticon Software, a provider of visual data analysis software for real-time complex-event processing (CEP) and historical time-series data.
Panopticon has announced a new version of its platform that supports Windows and Java IT environments, allowing clients to embed visual analysis-enabled Panopticon dashboards into their own apps.
“Our capital markets clients, who include JPMorgan Chase, Citadel and BlackRock, have recognized this for some time and have made visualization an integral part of their ‘big data’ strategies,” De Geer said.
Data visualization is a critical component in big data implementations in capital markets.
“The amount of data coming into a firm is staggering and it is impossible for humans to make intelligent use of it without visualization supported by an appropriate infrastructure, which may include CEP, in-memory processing, columnar tick databases and the like,” said De Geer.
For quantitative analysts, data visualization is a primary means of making sense of millions of data points, both real-time streaming and historical.
“Data visualization is part of the solution for examining many markets and investigating millions of possible strategies,” said Elliot Noma, founder of Garrett Asset Management, a systematic trader in financial and commodity futures. “We will backtest them in terms of how they’ve done historically, and update subsets of strategies to see how well they’ve performed.”
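The backtesting workflow Noma describes can be illustrated in miniature. The sketch below is purely hypothetical (not Garrett Asset Management's actual system): it backtests a simple moving-average crossover rule over historical prices, then ranks several parameterizations of the strategy by cumulative performance, a small-scale version of screening "millions of possible strategies."

```python
import statistics

def backtest(prices, fast=3, slow=6):
    """Backtest a toy moving-average crossover strategy.

    Goes long one unit whenever the fast moving average is above
    the slow one; returns cumulative P&L over the price series.
    """
    pnl = 0.0
    position = 0
    for i in range(slow, len(prices)):
        fast_ma = statistics.mean(prices[i - fast:i])
        slow_ma = statistics.mean(prices[i - slow:i])
        # P&L accrues on the position held over the last price move
        pnl += position * (prices[i] - prices[i - 1])
        position = 1 if fast_ma > slow_ma else 0
    return pnl

# Rank a handful of parameterizations by historical performance,
# mirroring the strategy-screening workflow in miniature.
prices = [100, 101, 103, 102, 104, 107, 106, 108, 110, 109, 111]
results = {(f, s): backtest(prices, f, s)
           for f in (2, 3) for s in (5, 6) if f < s}
best = max(results, key=results.get)
```

In practice each candidate strategy would be run against years of tick data, and visualization tools would be used to compare the resulting performance surfaces rather than a simple `max`.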
Capital markets firms are dealing with exploding data volumes and increasing velocity driven by high-frequency trading, new technology and regulatory requirements.
“New technologies like CEP produce real-time streams of data that have been processed according to complex algorithms and business logic programmed into the CEP engines,” De Geer at Panopticon said.
The result is “the availability of new data sets and data feeds, including, but not limited to, news sentiment analysis data streams that traders and other people within financial firms require in order to take advantage of profitable trading opportunities,” he said. “New regulations require firms to be able to monitor risk, performance and other factors on an intra-day or on-demand basis,” De Geer added.
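A CEP engine applies rules like this to event streams in real time. The fragment below is an illustrative sketch, not any vendor's actual engine: a generator that consumes (symbol, price) events and emits an alert whenever the latest price deviates from a rolling-window mean by more than a threshold, the kind of derived, rule-processed stream De Geer describes feeding into dashboards.

```python
from collections import deque

def cep_alerts(events, window=5, threshold=0.02):
    """Toy complex-event-processing rule.

    Consumes an iterable of (symbol, price) events and yields
    (symbol, price, rolling_mean) whenever a price deviates from
    the rolling-window mean by more than `threshold` (fractional).
    """
    history = {}
    for symbol, price in events:
        win = history.setdefault(symbol, deque(maxlen=window))
        if len(win) == window:
            mean = sum(win) / window
            if abs(price - mean) / mean > threshold:
                yield (symbol, price, mean)
        win.append(price)

# Five flat ticks fill the window; the sixth jumps ~4.5% and triggers an alert.
stream = [("EURUSD", 1.10)] * 5 + [("EURUSD", 1.15)]
alerts = list(cep_alerts(stream))
```

A production engine would express such rules declaratively and handle joins across many feeds, but the principle of turning raw ticks into enriched, actionable events is the same.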
The Panopticon system is designed to let business users design, modify and share their own dashboards, based on their unique perspectives on current business challenges, without the delays caused by involving IT specialists.
“This flexibility is key in a big data environment, particularly in capital markets, where fast-changing events and market moves often create the need for new perspectives on all available data,” De Geer said.
The new version features improved connectivity for high performance columnar tick databases, including Kx Kdb+ and OneTick, as well as connectors to subscribe to Oracle, OneTick, SAP Sybase ESP, StreamBase CEP and LiveView.