Fund Managers to Increase Spend on Data Analytics
The majority of asset managers expect to spend more on data analytics over the next three years, as only a small fraction feel they capture full value from the data they acquire.
Only 13% of fund managers said their organization is capable of capturing the value of data “entirely” in a study sponsored by Northern Trust and conducted by The Economist Intelligence Unit. Last September the EIU surveyed 201 asset and insurance management executives, half of whom work for firms with assets under management of more than $5bn (€4.4bn).
Dan Shoenholz, managing director of Parthenon-EY Strategy Services at Ernst & Young, said in the report: “We’ve seen a lot of investment in the last few years in costly infrastructure to enable data analytics – things like servers, information technology professionals, tools to collect and aggregate the data – and we’re on the cusp of seeing more activity in the integration and utilization of this information.”
The majority, 85%, of asset managers said their organizations only captured value from data “fairly” or “somewhat” well.
“Some data is relevant to practitioners’ current models, but much is not,” added the study. “And with no established interchange standards, asset managers are forced to expend resources on cleaning and preparing data that is often incompatible or even incorrect.”
The absence of industry-wide standards for data interchange means users are forced to allocate considerable time and labor to processing and scrubbing purchased data, dealing with incompatible formats and separating useful from non-useful data. “One in five survey respondents say that incorrect data is one of their top challenges,” said the report.
David Blackwell, head of client analytics, at UBS Wealth Management, estimated in the study that as much as 70% of analysts’ time can be spent on managing raw data, cleaning it and preparing it for analysis. “This means that only a fraction of analyst work-hours is left to extract insights and guide strategic decisions,” he added.
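The kind of preparation work Blackwell describes can be illustrated with a minimal, hypothetical Python sketch. The field names, date formats and vendor conventions below are invented for illustration and are not taken from any firm or vendor mentioned in the article:

```python
from datetime import datetime

# Hypothetical raw records as they might arrive from two vendors with
# incompatible conventions: ISO vs US date formats, currency symbols,
# and missing or incorrect fields.
raw_rows = [
    {"date": "2016-03-01", "ticker": "ABC", "close": "102.50"},
    {"date": "01/03/2016", "ticker": "ABC", "close": "$103.10"},
    {"date": "2016-03-02", "ticker": "XYZ", "close": ""},  # bad price
]

def parse_date(text):
    """Try the formats we know about; reject anything else."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue
    return None

def clean(rows):
    """Normalize dates and prices, dropping rows that cannot be repaired."""
    cleaned = []
    for row in rows:
        date = parse_date(row["date"])
        price_text = row["close"].lstrip("$").strip()
        if date is None or not price_text:
            continue  # separate useful from non-useful data
        cleaned.append({"date": date.isoformat(),
                        "ticker": row["ticker"],
                        "close": float(price_text)})
    return cleaned

print(clean(raw_rows))
```

Even this toy version shows why the work consumes analyst hours: every vendor convention needs its own parsing rule, and unrepairable rows must be detected and discarded before any modelling can begin.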
Neal Pawar, principal and chief technology officer at AQR Capital Management, a global investment management firm, said in the report that data vendors can impose licensing requirements that limit the feasibility of third-party data preparation services.
“The way we scrub data is not proprietary – the secret sauce is in the models, not in our ability to parse Bloomberg data feeds,” said Pawar. “So the idea of a custodian or transfer agent or some independent party providing this as a service resonates tremendously with most of us in the industry.”
GreySpark Partners, a capital markets consulting firm, said in a new report that many financial sector firms are not using the fast processing of huge, diverse datasets to facilitate advanced analytics and deliver new insights.
“Natural language processing and predictive analytics, in particular, have opened doors for the financial services industry that were not imaginable 20 years ago,” said the report.
The study, “Big Data Use Cases in Financial Services”, said more advanced analytics, such as artificial intelligence and machine learning, are becoming a commercial reality. Machine learning, a branch of artificial intelligence, makes predictions based on patterns found by analyzing a vast array of market information, including news and social media.
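The pattern-finding the report describes can be sketched, in deliberately simplified form, as a bag-of-words sentiment score over news headlines. The keyword lists and headlines below are invented for illustration; production systems use trained NLP models rather than fixed word lists:

```python
# Toy sentiment scoring over news headlines: count positive versus
# negative keywords. Word lists and headlines are invented examples.
POSITIVE = {"beats", "growth", "upgrade", "record"}
NEGATIVE = {"miss", "downgrade", "loss", "probe"}

def sentiment(headline):
    """Return +1, -1 or 0 depending on which keyword set dominates."""
    words = set(headline.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

headlines = [
    "Retailer beats estimates on record holiday growth",
    "Regulator opens probe into lender after loan loss",
    "Shares flat as markets await central bank decision",
]
for h in headlines:
    print(sentiment(h), h)
```

The gap between this sketch and a commercial system — handling negation, context and the sheer volume of news and social media feeds — is exactly where the advanced analytics the report describes come in.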
“While the use of Big Data analytics solutions is becoming more commonplace in the global financial services sector in 2016 as a means of drawing insight from vast swathes of data, the full potential of the technology is not yet being harnessed by many financial firms,” said GreySpark.
The consultancy detailed some examples of financial firms that have successfully used Big Data business models.
For example, Man Group’s AHL hedge fund unit replaced a range of disparate relational databases holding financial market data with one data platform, which cut disk storage by 60% and standardized the computing language used to query the data. The report said: “For some types of analysis, quants at the firm could then model 25 times faster than they were before the deployment of the NoSQL database. As well as the use of MongoDB by the quants on market data, the firm intended to extend its use of the technology to its blackbox trading systems.”
At BlackRock, portfolio managers can access information on the fund manager’s Big Data platform which ranges from shipping trends to word search analytics in earnings calls. GreySpark said: “In the US, BlackRock is reportedly working on using data and satellite imagery to show how real-time situations are affecting portfolios, such as how many cars are in car parks of Wal-Mart stores across the country. This data could be used by an analyst to determine how much consumers are spending, and a stock-picker could use it to investigate company sales.”
GreySpark added that although its report details use cases in financial services, firms have not yet widely adopted a universal Big Data approach and commonly limit their Big Data clusters to a department, team or process.
“GreySpark anticipates that traditional relational databases and data warehouses will be used alongside Big Data Platforms for specific tasks that are purely quantitative in nature and have less touch points with other banking processes,” added the report. “The wider adoption of Big Data technology, though, hinges on the revenues and savings captured as a result of successful applications of its use cases and, no matter how impressive the technology is to technologists, investments in Big Data must be driven by the business.”
Featured image via apinan/Dollar Photo Club