Understanding Your Data is No Longer Optional
By Ronald Jordan, Senior Advisor, Global Markets Advisory Group
The story of financial services has always been one of traders seeking out more (and better) data faster than the rest of the market, and figuring out how to profit from it. Over the years, “data management” has evolved from semaphore flags to telegraph wires, telephone lines, ticker tape machines, Quotrons, automated data feeds and intelligent algorithms. But regardless of the method, the goal has always been the same: more, better, and faster data.
Today we are awash in more data than ever, as traditional sources get faster and new sources crop up that we couldn’t profitably mine before. As a result, the world of financial services is more data-driven than ever. The opportunity for deeper insights and better decision making has never been greater.
People, processes (operations) and technology have long been the key components a company needed to deliver a product or service; in human terms, these are the vital organs of the corporate being. Not surprisingly, therefore, corporations commit the vast majority of their resources to these functions. But while the proper functioning of these organs has always been essential to a company’s health, the data that they generate has evolved into its life blood.
This data has always been there, permeating the organization, its customers, its people, and its processes. But to be useful (and usable) in the new data economy, it needs to be systematically identified, captured, processed, transformed, and distributed efficiently. In fact, there is a growing correlation between how well firms use data and how well they can compete and adapt to the realities of the new marketplace. This is evident in the efforts underway all over the financial services industry to turn “unstructured” data into “structured” data, to use machine learning and artificial intelligence to analyze it, and to integrate alternative data content into core operations and processes. Leveraging these new technologies and data sources increases the amount of usable data, which can help companies streamline and rationalize organizational processes, reduce operating risks and costs, and expand, customize, and improve products and services.
Beyond internal consumption, companies are increasingly being challenged to productize and export their own data. There is a growing demand among clients for data that will seamlessly integrate into their own work processes and make them more efficient. To capitalize on this demand, firms must understand, first, that the data they produce is itself a “product” with independent value and, second, that they may be able to package their raw data into data products that will expand existing client relationships and create new commercial opportunities.
In practical terms, organizations need to treat data as an asset, and devote sufficient time and attention to understanding it and nurturing it just like any other asset. At the very least, organizations should augment existing skill sets and capabilities to add a data-focused perspective to their operating fabric.
And while it’s one thing to understand that data has value, it’s another thing entirely to understand how to extract that value. A common mistake companies make is to overlook the foundations necessary to effectively manage their data or leverage its asset value. Data infrastructures often are not built intentionally; they consist of a hodgepodge of legacy systems in which disparate databases do not automatically speak to each other, a single data element carries multiple meanings (or multiple elements carry the same meaning), and data is created “fit for purpose” within specific business processes without ever being defined or understood. In the old order, data was simply a means to an end, so there was no need to centralize data oversight or to add structure or science to organizational data management. That’s no longer the case.
Data management science has evolved to add structure to how organizations understand, maintain, and use their data, whether to improve operating performance, reduce operating risk, or drive revenue and client service. Understanding the precise meaning of data, how it is being used within the organization, and how it is being modified or manufactured are among the core components of the science of data management. In the same way that “agile” and “waterfall” transformed software development and deployment by adding new structures that better aligned goals and resources, the tools of data management science can transform how a company understands its own people, processes, and technology, as well as those of its customers.
Prioritizing and investing in data management is the next essential frontier for all companies, not just the biggest or most established. After all, the better any company understands its data, the easier it is to profit from it.
Global Markets Advisory Group provides strategic advice on listings, regulation, compliance, operations, and technology to financial firms wishing to access, or already operating within, US capital markets, and has expanded to offer data management and data commercialization advisory services as well.