Data Mastering and Distribution – Putting New Data in the Hands of the Business
Recent changes in business and regulatory reporting requirements have compelled financial institutions to adopt data management models that can respond to real-time data requests from diverse sources. And as the industry starts to explore a growing range of innovative data sources, from sentiment analysis to satellite imagery, to add depth to traditional data resources, there is a significant opportunity to gain new insights through machine learning, correlation and pattern spotting – insights that can improve time to market and drive down costs.
Yet unlocking this new data diversity requires a fundamental change to traditional data management; financial data models must support not only fast and effective on-boarding of new data sources but also efficient data exploration and easy access and distribution of data across the business.
As Martijn Groot, VP, Product Management, Asset Control, explains, traditional EDM models are being replaced by a new generation of data services that address the full information lifecycle, from data sourcing to integration and distribution.
New Data Services
From investment assessment to Know Your Customer (KYC), the way in which financial institutions approach decision making is set to change fundamentally over the next year as organizations begin to on-board and explore the new raft of data sources. While the new data model has been driven by regulatory demands, the sheer depth of information now created and collected globally is extraordinary – and is set to take the industry far beyond the traditional catalog of price and reference data sources.
From web crawling that mines news and spots corporate events, to sentiment analysis, satellite and geospatial information, traffic and travel patterns and property listings – the way in which organizations can analyse investment opportunities, track Politically Exposed Persons and monitor company news is being transformed.
No longer will organizations be limited to published financial statements and earnings calls; instead investment decisions can be based on a much broader and deeper – but potentially also murkier – set of data. For example, the addition of social media sentiment analysis, combined with satellite information tracking car park usage, can deliver a new level of understanding of a supermarket’s performance. Indeed, with transcripts of all earnings calls now available, it is possible to understand who is asking specific questions and how CEOs and CFOs respond – insight which can be tracked and analyzed to deliver fast, actionable investment insight. Similarly, in KYC, the ability to rapidly deep-dive into multiple diverse data sources provides a chance to address the escalating overhead associated with customer on-boarding and reduce the cost of doing business.
New Mastering Model
The challenge, of course, is to find a way to harness these new data sources; to on-board this new insight in a way that is fast, effective and usable. Where does this leave traditional EDM solutions that have played a vital role in managing traditional data sources? The mastering process must still provide a 360-degree version of the truth that can be used across the organization, from valuations to risk and financial reporting; the addition of data sources reinforces the need for excellent structured processes that compare sources to find discrepancies and deliver that golden source. But this process must now also deliver excellent integration – with organizations looking for robust Application Programming Interfaces (APIs) to enable the fast stitching together and exploration of these new data sources.
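The cross-source comparison step described above can be sketched in a few lines. This is an illustrative simplification, not Asset Control's implementation: the field names, the midpoint rule and the 0.5% tolerance are all assumptions made for the example.

```python
# Hypothetical sketch of one mastering rule: compare the same instrument's
# price from two vendor feeds; publish a "golden" value when they agree
# within tolerance, otherwise raise an exception for manual review.
# The tolerance and the midpoint rule are illustrative assumptions.

TOLERANCE = 0.005  # 0.5% relative difference triggers an exception


def master_price(vendor_a: float, vendor_b: float) -> dict:
    """Return a golden price, or flag the record for review."""
    mid = (vendor_a + vendor_b) / 2
    rel_diff = abs(vendor_a - vendor_b) / mid
    if rel_diff > TOLERANCE:
        return {"status": "exception", "rel_diff": round(rel_diff, 4)}
    # Sources agree: take the midpoint as the golden value
    return {"status": "golden", "price": round(mid, 4)}


print(master_price(100.10, 100.20))  # small gap: golden value published
print(master_price(100.00, 102.00))  # large gap: exception raised
```

In practice such rules run per field and per source pair, with exceptions routed to a data-operations workflow rather than simply returned.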
In addition to adding new depth to traditional information, these data sources also change the emphasis of the mastering process. Rather than focusing on error detection in order to achieve consistency and accuracy, these sources enable organizations to undertake pattern discovery, leveraging new techniques, including machine learning, to spot new correlations or reveal unusual activity – in market surveillance, for example.
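As a minimal stand-in for the machine-learning surveillance techniques mentioned above, the sketch below flags daily price moves whose deviation from the average return exceeds a z-score threshold. The data and the threshold are invented for illustration.

```python
# Illustrative outlier discovery: flag price moves that deviate unusually
# from the series' typical daily return. A real surveillance model would
# be far richer; this shows only the shape of the idea.
from statistics import mean, stdev


def flag_unusual_moves(prices: list[float], z_threshold: float = 3.0) -> list[int]:
    """Return indices of prices reached by an unusually large move."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    mu, sigma = mean(returns), stdev(returns)
    # i + 1 because return i describes the move into price i + 1
    return [i + 1 for i, r in enumerate(returns) if abs(r - mu) > z_threshold * sigma]


prices = [100.0, 100.1, 100.0, 100.2, 100.1, 100.0, 100.1, 100.2,
          100.1, 100.0, 93.0, 93.1, 93.0, 93.1, 93.2, 93.1]
print(flag_unusual_moves(prices))  # flags the 7% drop at index 10
```

The same pattern – learn what "normal" looks like, then surface deviations – underpins far more sophisticated correlation and anomaly models.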
Speed is critical; with the vast number of new data sources available, fast, effective data discovery will be essential to drive down the cost of change and provide organizations with a chance to gain differentiation in time to market. Intelligent data mastering is at the heart of this new model. Combining APIs that enable integration with an easy process for testing and on-boarding these new models into production will be essential – and that will require APIs that support popular data science languages, including R and Python. In addition, the use of NoSQL technology, combined with the ability to deploy new models close to the data, will be key to supporting the significant associated data processing demand.
This ability to combine robust data mastering processes with excellent integration will build a new data foundation; it will enable an organization to pull together these diverse data sets and create new insight that combines social media sentiment and satellite-derived performance indicators with the traditional measures of price history and published financials.
To maximize the value of these data sources, organizations also need to reconsider access and utilization. Making these new data sets easily accessible, not only to new algorithms and data scientists but also to end users within risk, investment, operations or compliance will mark a significant step change in data exploitation.
Ensuring the data easily integrates with the languages adopted by data scientists is fundamental; but to deliver the immense potential value to end users, data analysis must evolve beyond the traditional technical requirements of SQL queries. Offering end users self-service access via enterprise search, a browser, Excel and easy-to-understand interaction models, rather than via proprietary APIs and custom symbologies, will open up these new data sources to deliver even greater corporate value.
These new data sources are radically different to the traditional data resources – and their potential value to an organization is untapped. Pattern matching in particular can be oriented not only towards improving operations or reducing risk but also towards improved pricing and new revenue opportunities. Matching of data items will not only take place through common keys but also through spotting the same behavior in hitherto unrelated data or otherwise finding repeating patterns in time, space and across different data sets.
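Matching through behaviour rather than through a shared key, as described above, can be illustrated with a simple correlation test: two return series with no common identifier are linked when they move together closely enough. The series, names and 0.9 threshold below are assumptions for the sketch, not a production matching rule.

```python
# Hedged sketch of behaviour-based matching: link two data series that
# share no common key by checking whether their daily returns are highly
# correlated (Pearson correlation above an illustrative threshold).
from statistics import mean


def correlation(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def behaves_alike(a: list[float], b: list[float], threshold: float = 0.9) -> bool:
    return correlation(a, b) > threshold


listed = [0.01, -0.02, 0.015, 0.005, -0.01]        # returns from a keyed source
unlabeled = [0.012, -0.018, 0.014, 0.004, -0.011]  # feed with no common key
print(behaves_alike(listed, unlabeled))  # near-identical behaviour: a match
```

Correlation is only one similarity measure; the same matching logic extends to patterns in time, space and across otherwise unrelated data sets, as the paragraph above notes.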
Especially for active investment management, the use of non-traditional data sources can help compete and differentiate against passive investment strategies; while in compliance and risk management, accessing a broader range of sources can help trigger early warnings on suspect transactions, relationships or price movements.
The potential is incredibly exciting – and first mover advantage cannot be overstated. The key for financial institutions over the next year or so is to move beyond traditional EDM models and embrace the new mastering and distribution services that will enable essential exploitation of data across the business.