03.26.2015

Avoiding the Performance Data Fumble

A superset of IBOR, a Performance Book of Record (PBOR) gives investment firms control of their performance data

By Rich Mailhos, Product Manager, Eagle Investment Systems

If you’ve sat in on enough boardroom meetings, you’ve likely witnessed the dreaded performance-data fumble. The senior-most executive, spreadsheets held high, zeros in on a number that doesn’t look right. Minutes may pass before he or she finally asks why the currency effect in the attribution analysis doesn’t reconcile. From there, it may take a week or more to determine that the various systems whose inputs fed the report either applied disparate FX rate sources or produced timing mismatches that skewed the data.
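
To make that fumble concrete, here is a minimal sketch, using hypothetical FX rates and a deliberately simplified currency-effect formula, of how two systems that snapshot EUR/USD from different sources produce currency effects that do not reconcile. The position size, rates and the one-line formula are all illustrative assumptions, not any vendor's methodology.

```python
# Hypothetical rates: the same 5% EUR-denominated position, with the currency
# return snapped from two different FX sources.

def currency_effect(weight, fx_return):
    """Simplified currency contribution: position weight times the FX return."""
    return weight * fx_return

weight = 0.05                # 5% of the portfolio in a EUR-denominated holding
eur_usd_start = 1.0850       # common starting rate

fx_return_source_a = 1.0921 / eur_usd_start - 1   # 4pm London snapshot (hypothetical)
fx_return_source_b = 1.0897 / eur_usd_start - 1   # New York close snapshot (hypothetical)

effect_a = currency_effect(weight, fx_return_source_a)
effect_b = currency_effect(weight, fx_return_source_b)

print(f"Currency effect, source A: {effect_a:.4%}")   # ~0.0327%
print(f"Currency effect, source B: {effect_b:.4%}")   # ~0.0217%
# Small per position, but compounded across hundreds of holdings this is exactly
# the kind of inconsistency that surfaces in a boardroom attribution report.
```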

Of course, this is just one of countless discrepancies that can produce inconsistent performance data and erode trust in critical management reporting. As investment firms and asset managers grow in scale and product set, most end up with a patchwork of systems feeding a firm-wide data store of positions, transactions, performance, benchmark and risk data. Along the way, inconsistencies can become pervasive and, at the same time, ever harder to track as the volume and breadth of inputs multiply. Investment firms can throw resources and man-hours at the problem, but the manual reconciliation of data across geographies and asset classes can actually make the issues more pronounced as discrete groups within the organization apply discordant calculation rules. This only adds unwanted subjectivity to vital information: performance and risk measures that need to be completely objective, transparent and reliable.

And data quality has never been more critical. The efficiency of today’s investment landscape demands that the front office be equipped with timely, multi-level performance data that is consistent across the enterprise. As clients and consultants raise the bar in their reporting expectations, the pressure to get performance data right can be immense. And make no mistake, the regulators are watching too. The awkwardly titled Aberrational Performance Inquiry, for example, was introduced in 2011 by the Securities and Exchange Commission to use technology and risk analytics to screen the performance data of thousands of hedge funds; the effort has already yielded a number of enforcement actions. Witness too the SEC’s adoption of new money market fund rules requiring a floating NAV for institutional prime funds, or the growing push by global regulatory bodies (Basel, APRA, and Solvency II) for granular data and risk measures on a regular basis.

Many investment firms have gone as far as they can with their legacy systems. The buy-versus-build debate around a data management and performance measurement platform has largely been deferred over the past five to 10 years, as asset managers have prioritized investments in new fund products and acquisitions over replacing their at-capacity performance measurement systems. But with every acquisition or new product strategy, each with its own regulatory requirements and reporting standards, the need for timely, accurate and consistent performance data moves one step closer to critical mass. This is the point at which a legacy platform becomes a liability that carries real risk to the entire enterprise and asset owner. Recall the collapse of MF Global, which went from a sleepy commodities and derivatives brokerage to an active trader with an aggressive prop desk under its former CEO, Jon Corzine. MF Global was ultimately undone when it failed to capture the risk of off-balance-sheet repo-to-maturity trades in European sovereign debt and derivative contract commitments.

To be sure, it is in some respects this need for consistency and a single source of investment data that is driving many firms to seek out solutions that deliver an investment book of record, or IBOR as it’s typically called. An IBOR, as most well know, centralizes and automates a firm’s investment data to allow for one consolidated view into both start-of-day and intraday investment positions. It differs from an ABOR, or accounting book of record, in that it is less focused on the historical data needed for end-of-day bookkeeping and instead designed for real-time needs, such as collateral and cash management, exposure reporting and compliance.

Where an IBOR can sometimes fall short, however, is when the front office tries to stretch its functionality to provide performance and attribution data or comprehensive risk analysis. Typical IBOR solutions, particularly those built off accounting platforms, are often missing the data or functionality needed to calculate rates of return across all asset classes, and may lack the integration and calculation rules that would allow the front office to manage benchmarks or cut the data as needed. To wit, most IBORs built off an accounting platform or an OMS do not have the requisite calculation capabilities to quickly produce pre-tax, post-tax, post-liquidation and convertible-class NAV statistics for retail funds, or a clear view into the notional economic exposure of derivative contracts and the related “underlying exposure” questions asked by investment committees. The typical IBOR would also be hard pressed to accommodate custom benchmarking that incorporates hedged assumptions, currency conversions, carve-outs or hurdles.
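
The notional-exposure point is the easiest to illustrate. Below is a minimal sketch, with a hypothetical futures position and invented contract terms, contrasting the market value an accounting-oriented book typically carries with the underlying economic exposure an investment committee usually asks about. The contract size, multiplier and margin figures are assumptions for illustration only.

```python
# Hypothetical futures position: the accounting view versus the notional
# economic exposure behind the "underlying exposure" question.

contracts = 100                # long equity index futures (hypothetical position)
futures_price = 2100.0         # current futures price
multiplier = 50.0              # dollars per index point (hypothetical contract spec)
variation_margin = 125_000.0   # cumulative mark-to-market gain carried as cash

accounting_value = variation_margin                         # what the books often show
notional_exposure = contracts * futures_price * multiplier  # economic exposure to equities

print(f"Accounting market value:  ${accounting_value:,.0f}")    # $125,000
print(f"Notional equity exposure: ${notional_exposure:,.0f}")   # $10,500,000
```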

Enter PBOR

This is where a performance book of record, or PBOR, comes into its own. Effectively a superset of IBOR, a PBOR provides a “true” total view of performance results, exposure analysis and enriched data across business lines and asset classes, at an even more granular level of detail. A PBOR gives firms capabilities around data management and governance, workflow automation, data enrichment, source-system traceability and data validation. Accommodating both industry-standard and customizable performance calculation rules, a PBOR supports on-demand portfolio recalculations, results validation, benchmark data management, and the aggregation of portfolios into composite hierarchies (sketched below). These capabilities, while a bit esoteric, are critical to dynamic performance measurement and thorough risk analysis, particularly as investment products take on added complexity.
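
As a rough illustration of the aggregation step, here is a minimal sketch, with hypothetical portfolios, of an asset-weighted roll-up of portfolio returns into a composite. Weighting by beginning market value follows a common industry convention; it is not a description of any particular vendor's implementation.

```python
# Hypothetical portfolios rolled up into one composite, weighted by beginning
# market value rather than simply averaged.

portfolios = [
    {"name": "Fund A", "begin_mv": 120_000_000, "monthly_return": 0.0182},
    {"name": "Fund B", "begin_mv": 45_000_000,  "monthly_return": 0.0154},
    {"name": "Fund C", "begin_mv": 310_000_000, "monthly_return": 0.0201},
]

total_mv = sum(p["begin_mv"] for p in portfolios)
composite_return = sum(
    p["monthly_return"] * p["begin_mv"] / total_mv for p in portfolios
)

print(f"Asset-weighted composite return: {composite_return:.2%}")  # ~1.92%
```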

We know that a distinct need for a performance book of record already exists. In one survey of senior performance, data management and financial technology executives, roughly one in three reported using three or more systems to generate their performance data, and nearly one in ten (9% of those polled) said they use ten or more.

This can create obvious problems. Beyond the potential for the FX timing headaches discussed above, multiple systems are prone to complications when generating performance-attribution and risk-analysis calculations, leading to discrepancies that can be nearly imperceptible when presented on a consolidated spreadsheet. Consider the calculation of certain statistics used in risk analysis. How are durations and other critical analytics calculated or sourced across systems? Are they calculated against corresponding and relevant benchmarks in every case? Going a level deeper, what are the inputs that make up the Treynor or Sharpe ratios, or the various forms of value at risk (VaR), and are those inputs consistent across systems (see the sketch below)? Answering these questions requires a data-centric PBOR that provides flexible reporting and analysis, with drill-down tools that deliver the transparency to look through underlying assets and securitized holdings.
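
The sketch below makes the input-consistency point with two of those statistics. The return series, risk-free rate and beta are hypothetical, and the formulas are the textbook definitions, simplified for illustration; the point is only that the ratios share inputs that every system must source identically.

```python
import statistics

# Hypothetical monthly portfolio returns and the inputs two systems must agree on.
monthly_returns = [0.012, -0.004, 0.021, 0.008, -0.010, 0.015]
risk_free_rate = 0.002   # monthly risk-free rate (which curve, and as of when?)
beta = 1.15              # beta versus which benchmark, computed over what window?

excess_returns = [r - risk_free_rate for r in monthly_returns]
mean_excess = statistics.mean(excess_returns)
volatility = statistics.stdev(excess_returns)   # sample standard deviation

sharpe_ratio = mean_excess / volatility
treynor_ratio = mean_excess / beta

print(f"Sharpe ratio:  {sharpe_ratio:.3f}")
print(f"Treynor ratio: {treynor_ratio:.4f}")
# Change the risk-free source or the benchmark behind beta and both ratios move,
# even though the portfolio returns themselves are identical across systems.
```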

This level of detail is critical for a number of reasons. Take your typical global asset manager: a firm that maintains offices across multiple jurisdictions, offering different product lines and asset classes that each require different business rules, can face the gargantuan task of manually piecing together position, transaction, performance and benchmark data from a collection of disparate systems. Supporting the various discrete performance calculations across asset types and currencies, and then piecing it all together onto a single platform, requires countless hours and diligent oversight. What’s more, performing these tasks manually creates unnecessary risk, as many firms continue to rely on spreadsheets as mission-critical tools for aggregating or deriving key information. Those who read the US Senate’s report on the investigation into the London Whale scandal only had to reach page eight to find that “error-prone manual data entry” incorporating “formula and calculation errors” contributed to a reported $6.2 billion loss. (Without getting into the granular details, the spreadsheet divided by the sum of two inputs when it should have divided by their average.)
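
For readers who want to see why that seemingly small formula error matters, here is a minimal sketch with purely illustrative numbers (not the actual figures from the report): dividing by the sum of two rates instead of their average roughly halves the result.

```python
# Illustrative numbers only: the general shape of a divide-by-sum versus
# divide-by-average spreadsheet error.

old_rate = 0.042
new_rate = 0.038
price_change = 0.0016

vol_divide_by_sum = price_change / (old_rate + new_rate)         # the erroneous formula
vol_divide_by_avg = price_change / ((old_rate + new_rate) / 2)   # the intended formula

print(f"Dividing by the sum:     {vol_divide_by_sum:.3f}")   # 0.020
print(f"Dividing by the average: {vol_divide_by_avg:.3f}")   # 0.040, twice as large
```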

The drain on resources alone can serve as a catalyst for many firms to consider a PBOR solution. For most, however, the driver is the combination of risk mitigation and a desire to make their data trustworthy and actionable, so that it not only represents an accurate and timely view of positions, performance and risk, but also adds strategic value and improves investment performance over time. Moreover, as firms shift their fund strategies and expand into new markets and products, many are re-evaluating their approach to data management altogether. As part of this, many are gravitating toward a PBOR solution that not only enhances the ROI of such a project, but also enables the enterprise to gain control of its performance data and drive the consistency that meets back-, middle- and front-office needs while meeting the expectations of clients and asset owners.

Rich Mailhos is the Product Manager overseeing the Performance Measurement solution of Eagle Investment Systems, a BNY Mellon company.
