Capital Markets Face Daunting Data Integration Challenges
06.12.2012
For capital markets firms, valuable data lives in silos across multiple lines of business, geographies and product desks, where it cannot be normalized, related and interpreted for analysis.
The presence of silos, or buckets of asset classes, complicates the task of managing the ever-growing streams of real-time data flowing through markets.
“Management of data in large institutions is a constant struggle,” said John Bosley, chief operating officer at Bonaire Software Solutions, a software provider. “Some legacy systems are grandfathered into practice, while mandates require the upgrading or sun-setting of other systems, either by license terms, compatibility mandates or external pressure.”
Islands of automation complicate the task of distributing information in real time.
“Operations are spread across multiple geographies, often with varied regulatory and compliance standards,” said Bosley. “Locations with less oversight pressure are reticent to observe stricter measures employed by other offices.”
Since the financial crisis, firms have re-evaluated their current operational and technical capabilities, and the forthcoming regulations have pushed that evaluation even further.
“Data workflow and data management is really at the heart of regulatory risk governance, and we are seeing firms spend a lot of time and money to ensure they have a best-of-breed solution in place,” said Mark Coriaty, managing director at software provider Ledgex Systems.
“We are seeing many firms rely on specialized third-party integrators to ensure their data management strategy is in line with the impending regulations and, with the advent of private cloud services, firms note that these private services not only meet their needs, but they are also cost-effective.”
Most firms are good at storing and securing data, but struggle with cleaning, verifying and reconciling data across the organization.
One approach to addressing this problem is to implement multi-asset class enterprise services that support multiple business lines. Where there is common shared infrastructure, it is easier to implement consistent auditing and reporting services.
“Homogenization between front, middle and back offices requires massive co-ordination and effort, and projects without board-level pressure will struggle to complete initial objectives,” said Bosley at Bonaire.
To successfully implement this strategy, it’s necessary to segregate core data structures from extensions needed to support a given business unit.
Many firms are well equipped to process structured data within a business unit, but they may not have the ability to efficiently process data across the entire enterprise.
While unstructured data such as emails, tweets and documents has existed for some time, its volume has exploded in recent years (at least 80% of the data produced today is unstructured), and the practice of incorporating this data into data warehouses and business intelligence analytics is relatively new. As such, the market is less mature in this respect.
Database vendors are catching on to the need to store, manage and search unstructured data. In addition to new data types, newer databases (e.g., SQL Server 2012) have added rich support for unstructured documents, including the ability to store, search and manage millions of folders and files within the database itself.
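As a minimal illustrative sketch of the idea, not SQL Server's actual FileTable feature, the snippet below keeps unstructured document content inside a relational database and searches it there, rather than in a separate file store. The schema and file paths are hypothetical, and SQLite stands in for an enterprise database.

```python
# Illustrative sketch (hypothetical schema): storing document content
# inside the database and searching it in place, in the spirit of
# features such as SQL Server 2012 FileTables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (path TEXT PRIMARY KEY, body TEXT)")

# Unstructured content (emails, notes) stored alongside structured data.
docs = [
    ("mail/2012-06-01.eml", "Counterparty exposure review for the rates desk"),
    ("notes/meeting.txt", "Data governance steering committee minutes"),
]
conn.executemany("INSERT INTO documents VALUES (?, ?)", docs)

# Search the unstructured content without leaving the database.
rows = conn.execute(
    "SELECT path FROM documents WHERE body LIKE ?", ("%governance%",)
).fetchall()
print(rows)  # [('notes/meeting.txt',)]
```

Keeping documents in the database lets the same auditing, backup and access-control machinery cover both structured records and unstructured files; a production system would add full-text indexing rather than a linear `LIKE` scan.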
These challenges result in a level of data distress that reduces efficiency, complicates audits and cripples business intelligence at the organizational level. With increasing regulatory pressures on the radar of board members, today’s market leaders are using this opportunity to exert top-down pressure for change.
“Home-grown systems are being abandoned for custom software developed by companies specializing in data aggregation and workflow management,” said Bosley.
“Bonaire continues to challenge leaders by providing the software tools necessary to reform antiquated data management and business practices.”
The current regulations will pose a challenge, however.
“We should view them as an operational hiccup that may slow innovation in the markets only during the short term,” said Coriaty at Ledgex Systems. “As the markets adapt and become comfortable with them, innovation of some form will continue.”