When it Comes to the Crunch: Making Sense of Big Data
Understanding and leveraging the vast amounts of data that financial firms must store and process is becoming a struggle for many.
Firms face onerous demands to comply with a host of new regulations, which are a growing drain on IT budgets, while also having to cope with technological advances that produce an ever-growing sea of data.
“Financial services organizations need now to take action,” said Maurizio Bradlaw, a partner at Capco, a technology consultancy.
“Over the last two years changes in the financial industry have been substantial: spiraling costs, multiple regulations, consumer expectations and development of new, data-rich media all add to the pressure. Effective data management can no longer be misunderstood, still less ignored.”
Getting a grip on all of this ‘big data’ is one of the hot industry topics.
“This whole big data hysteria, it’s gone beyond hype,” said Simon Garland, chief strategist of Kx Systems, a provider of high-performance databases and time series analysis.
In a recent survey of European financial institutions, Capco found that awareness of data policies was low, with nearly a quarter of respondents failing to grasp the impact of data management on a firm’s profit and loss. Almost half of the respondents managed data purely as a compliance issue, with no consideration of its revenue potential, while almost all respondents had yet to explore the use of social media data.
The Capco survey also found that the way data is managed varies substantially from one institution to another and that approaches to data can change within the same organization.
“The future success of financial institutions will partly be governed by their ability to deliver a good data management structure,” said Bernd Richter, a partner at Capco.
Getting data consistency across a firm is still a problem for many, though.
“Firms are still challenged with data consistency and accuracy which raises concerns considering today’s regulatory and auditing landscape,” said Chris John, chief executive of Bonaire Software Solutions, a provider of revenue management solutions.
And given the enormous amounts of data flowing in and out of financial institutions, some firms could even be paying different vendors for essentially the same information, owing to a fragmented internal environment. More efficient processes and the elimination of unnecessary duplication appear to be the order of the day, then.
“Without the proper systems and practices in place to ensure data is consistent, firms risk overpaying vendor fees and have difficulty making strategic decisions without the right intelligence,” said John.
Some firms are also showing more faith in their legacy systems than perhaps they should, considering the explosion of data in recent years.
“Financial professionals are more confident than they should be about the accuracy of their financial data,” said Simon Fowler, managing director, commercial division, at Advanced Business Solutions, a software provider.
Fowler said that many firms do not update information “in their finance system in real-time and the use of error-prone spreadsheets is widespread”.
“It’s time for organizations to challenge their existing financial systems and processes,” said Fowler.