Easing the Burden of Big Data
The increased complexity of financial markets, a rise in automation and added regulatory burdens are all making the task of collecting, and making sense of, data ever more difficult.
But firms are being urged to get a grip on all of this ‘big data’ now, before these data sets become too large and complex to process.
And with regulators now looking to impose more stringent checks on financial services firms in light of the global financial crisis, the time may now be right for firms to fork out on some extra IT spend to cover all bases.
“If the regulators suddenly descend upon you on ropes from the ceiling to come and inspect what you are up to and they ask a question and you can quickly type something and the answer is back with you in seconds, and that happens a few times, then they just assume that you have your act together,” Simon Garland, chief strategist of Kx Systems, a provider of high-performance database and time series analysis, told Markets Media.
“But if you say, ‘that’ll be difficult, I’ll have to phone someone from the IT department to come and give us a hand,’ then the regulators are more likely to start making themselves comfortable for a long stay.”
Growing derivatives activity and rising trading volumes in FX and equity markets, together with regulatory requirements, are forcing institutions to store and analyze vast quantities of data.
“The increased complexity of the markets and continued race towards automation, across more asset classes and venues, means that the enormous growth of data will continue,” said Daryan Dehghanpisheh, global director, financial services team at Intel, the multinational chip maker.
Kx has just released kdb+ v3.0, a data analysis and storage system, which promises considerable improvement in processing speeds when running on Intel’s most recent processors, support for WebSockets, GUIDs/UUIDs—unique identifiers, which facilitate the design of distributed systems—and simplified storage of billions of records.
The optimized code in kdb+ utilizes the processor-specific instructions available at run-time, while the simplified storage in v3.0 makes the design and implementation of large systems much less complex. While kdb+ has always been able to handle far more than two billion records, doing so has been made much simpler in Kx’s new release.
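The article does not say where the two-billion figure comes from, but it matches the largest signed 32-bit integer, a common index-width limit in database systems; 64-bit indices remove that ceiling entirely. A back-of-the-envelope sketch (the connection to kdb+ internals is an assumption here):

```python
# Illustrative arithmetic only: "two billion records" lines up with the
# largest signed 32-bit integer, a common index-width limit.
INT32_MAX = 2**31 - 1  # 2,147,483,647 -- roughly the "two billion" ceiling
INT64_MAX = 2**63 - 1  # 64-bit indices address over nine quintillion records

print(INT32_MAX)  # 2147483647
print(INT64_MAX > 10**12)  # True: a trillion records fits comfortably
```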
“Kx and other array languages have been doing this for 20 years but it hasn’t been given the same attention,” said Garland. “We’ve been doing big data since forever anyway. This whole big data hysteria, it’s gone beyond hype. But speed is more important when the data is big.”
Garland added: “This enhancement in v3.0 simplifies the design and implementation of large systems which have to handle more than a trillion records, allowing for a more elegant architecture.”
The addition of UUIDs as a basic data type means that distributed systems are now easier to write. UUIDs uniquely identify individual records, making them a valuable tool for managing distributed systems. In highly complex systems spread across different regions and continents, UUIDs make distributed processing more efficient and system design more straightforward, while storing and processing transaction IDs, such as order and confirmation IDs, becomes easier and more efficient.
“Managing multiple servers across different countries and continents can be a challenge and requires some complex programming,” said Garland. “UUIDs make this much more straightforward, as individual records can be uniquely identified; combined with the speed enhancements in v3.0 and ease of handling hundreds of billions of records, more efficient systems can be designed. This is an important step forward, especially in the face of ever-growing data volumes.”
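The property Garland describes can be sketched in a few lines: because each server can mint globally unique IDs with no coordination, records created independently in different regions can later be merged without key collisions. A minimal illustration using Python's standard `uuid` module (the record shape and region names are hypothetical):

```python
import uuid

# Hypothetical sketch: two regions create records independently, each
# tagged with a globally unique ID, so merging needs no coordination
# between servers and carries no risk of key collisions.
def new_record(region, payload):
    return {"id": uuid.uuid4(), "region": region, "payload": payload}

london = [new_record("LDN", f"order-{i}") for i in range(3)]
tokyo = [new_record("TYO", f"order-{i}") for i in range(3)]

# Merge by ID: identical payloads from different regions never clash.
merged = {rec["id"]: rec for rec in london + tokyo}
assert len(merged) == 6
```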
Another new feature in kdb+ v3.0 is the introduction of support for WebSockets, which allow for a direct, bi-directional, full-duplex connection between a browser and an application. This offers greater scalability and much faster processing and is particularly useful for high-performance browser-based applications, for example applications visualizing real-time data.
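What makes WebSockets lightweight enough for streaming real-time data to a browser is the compact message framing defined in RFC 6455. A simplified sketch of the server-to-client text frame (unmasked, short payloads only; the sample tick message is invented):

```python
# Simplified RFC 6455 server-to-client text framing: unmasked frames
# with payloads under 126 bytes, to show how small the per-message
# overhead is -- just two header bytes for a short text message.
def encode_text_frame(message: str) -> bytes:
    payload = message.encode("utf-8")
    assert len(payload) < 126, "extended-length fields not sketched here"
    # 0x81 = FIN bit set + opcode 0x1 (text frame)
    return bytes([0x81, len(payload)]) + payload

def decode_text_frame(frame: bytes) -> str:
    assert frame[0] == 0x81 and frame[1] < 126
    return frame[2 : 2 + frame[1]].decode("utf-8")

frame = encode_text_frame("tick: EURUSD 1.0842")
assert decode_text_frame(frame) == "tick: EURUSD 1.0842"
```

Real client-to-server frames add a 4-byte masking key, and production systems use a library or the browser's built-in `WebSocket` API rather than hand-rolled framing.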
“The new version of kdb+, running on the latest Intel processors, provides market participants and technologists with new capabilities, levels of performance and flexibility,” said Dehghanpisheh at Intel. “The result is a powerful tool in the hunt for alpha, while ensuring maximum stability and reliability.”