03.22.2012

Points of Reference

Capital markets firms continue to face thorny challenges associated with managing reference data – the names and identifiers associated with information about counterparties that is needed to determine a firm’s risk exposure.

Reference data management is fundamentally about organizing existing information that lies scattered about the enterprise into a logical whole.

“To us, Big Data means aggregating very disparate content sets – securities masters, corporate actions, real-time and historical pricing – into a single coherent collection,” said Marc Alvarez, senior director of reference data infrastructure at Interactive Data Corp.

Reference data decentralized across multiple systems magnifies the difficulty of maintaining accurate copies.

“Facilitation and assignment of unique customer identifiers requires a general cleansing, augmentation and de-duplication of data within the master reference database,” said Joseph Santangelo, principal consultant at Axis Technology, an IT consultancy for data management and security.
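The cleansing and de-duplication step Santangelo describes can be sketched in miniature. The following Python example is purely illustrative (the normalization rules, suffix list and field names are assumptions, not Axis Technology's actual method):

```python
import re

# Hypothetical normalization: lowercase, drop punctuation and common
# legal-form suffixes so near-duplicate counterparty names share a key.
LEGAL_SUFFIXES = {"inc", "ltd", "llc", "corp", "co", "plc", "ag", "sa"}

def normalize_name(name: str) -> str:
    cleaned = re.sub(r"[^\w\s]", " ", name.lower())
    tokens = [t for t in cleaned.split() if t not in LEGAL_SUFFIXES]
    return " ".join(tokens)

def dedupe(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each normalized-name key."""
    master: dict[str, dict] = {}
    for rec in records:
        master.setdefault(normalize_name(rec["name"]), rec)
    return list(master.values())
```

Under this scheme "Acme Corp.", "ACME CORP" and "Acme, Corp" all collapse to the key "acme", so only one master record survives; real implementations add fuzzy matching and manual review on top of such deterministic keys.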

Inconsistent, incomplete or inaccurate reference data is a leading cause of internal straight-through processing failures.

“A large portion of a trade record is composed of reference data, and a significant number of transaction breaks are caused by poor quality reference data,” said Santangelo. “The costs to repair trades and correct mismatches are significant and increase as errors pass through front to middle to back office systems.”

Institutional asset managers depend closely on their data providers for the accurate market data that fuels their investment strategies.

“We get our market data from a number of sell-side brokers, including our prime broker,” said Michael Levas, senior managing principal and director of trading at Olympian Capital Management. “We utilize their platforms on a daily basis to obtain information from all over the world.”

Noted IDC’s Alvarez: “Whereas customers operate in specific asset classes and geographies, data providers such as IDC need to operate on a global scale across geographies and asset classes. With the growth in volumes and complexity of applications, customers are leveraging our core competence as data managers.”

Reference data management must be undertaken from the perspective of the corporate end user.

“In order to focus on the right information, we continue to work to ensure we have gold-copy information delivered in an easy-to-use format that matches the needs of portfolio managers,” said Rodney Comegys, principal and head of the Index Analysis and ETF Trading teams at Vanguard Equity Investment Group.

A reference data management system must be capable of managing the complexities, idiosyncrasies and interrelationships that characterize reference data. “It requires a multi-faceted approach, involving the ability to access fragmented counterparty information and to manage legal hierarchies and cross-entity relationships,” said Ravi Shankar, senior director, master data management product marketing at Informatica.

Key reference data – for example, client, account and counterparty – is often stored in incompatible data silos, complicating the data access function. “Customers require technology to access all counterparty data irrespective of the system in which it resides,” said Shankar.

A core requirement is the ability to syndicate and share consistent reference data across the front, middle and back offices.

“Syndication involves building out connectivity to any target system, database or application and converting counterparty data into formats for any downstream system,” Shankar said.
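The syndication Shankar describes, rendering one gold-copy record into whatever format each downstream system expects, can be sketched minimally. The field names and target formats below are hypothetical, chosen only to illustrate the idea:

```python
import csv
import io
import json

def to_downstream(record: dict, fmt: str) -> str:
    """Render one gold-copy counterparty record for a downstream system."""
    if fmt == "json":  # e.g. a risk engine consuming JSON messages
        return json.dumps(record, sort_keys=True)
    if fmt == "csv":   # e.g. a back-office batch loader expecting flat files
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(record))
        writer.writeheader()
        writer.writerow(record)
        return buf.getvalue()
    raise ValueError(f"unsupported downstream format: {fmt}")
```

The point of the pattern is that the gold copy is authored once and each consumer gets a mechanical projection of it, so a correction made centrally propagates to every format automatically.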

First Derivatives, a capital markets data and trading systems consultancy, has developed a Reference Data Management Solutions toolkit that enables clients to deliver data to end consumers in the form in which they want to consume and use it.

“The success of a reference data implementation will be measured primarily on its ability to adopt and retain data consumers across the enterprise,” said Bob Wolfert, managing director at First Derivatives.

First Derivatives’ Delta Data Factory (DDF) is a hosted data management service that provides targeted reference data processing as a utility to buyers and sellers of data.

“DDF provides the ability to transform, consolidate and repackage data across a vast array of business information, enabling a firm to consolidate multiple information assets more quickly and efficiently,” said Wolfert.

Data vendors and publishers, as well as financial institutions, use DDF to outsource the processing and normalization of multiple in-bound reference data sources into enterprise data management or proprietary security master environments, Wolfert said.

Thomson Reuters selected DDF as a managed service “data formatting factory” to assist in its strategy to offer clients speedy integration and adoption of reference and pricing data, according to Tim Rice, managing director of global pricing and reference data at Thomson Reuters.

In the post-Madoff and post-Lehman Brothers era, regulators have shown a keen interest in coming up with a system of global legal entity identifiers (LEIs), so that they can gain a complete view of a firm’s exposure by counterparty.

The LEI Initiative hopes to provide an international standard for linking legal entities and portfolios under management.

“Emerging standards such as LEI have the potential to help ensure consistency and aid transparency, thus enabling firms to assess risk more quickly, speed up decision making and service customers more efficiently,” said Phil Lynch, chief executive of financial software provider Asset Control.

The need for a global LEI system to replace the patchwork of domestic systems in place is acute.

“Many markets have local identifiers which work well on a domestic basis,” said Tony Freeman, executive director of industry relations at Omgeo, a provider of post-trade processing services. “However, while the proposal to create a global LEI may be duplicative with existing domestic market programs, it’s essential for a global system to be launched.”
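The LEI that emerged from this initiative (ISO 17442) is a 20-character alphanumeric code whose last two digits are check digits computed with the ISO 7064 MOD 97-10 scheme, the same check used for IBANs. A minimal validator sketch (the example base code in the usage note is fabricated for illustration):

```python
def lei_check_digits(base: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an 18-char base.

    Letters map to 10..35 (A=10, ..., Z=35); digits stay as-is. Appending
    "00", converting to a number and taking 98 minus its remainder mod 97
    yields check digits that make the full code satisfy n mod 97 == 1.
    """
    converted = "".join(str(int(ch, 36)) for ch in (base + "00").upper())
    return f"{98 - int(converted) % 97:02d}"

def lei_is_valid(lei: str) -> bool:
    """True if a 20-character LEI passes the MOD 97-10 check."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    converted = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(converted) % 97 == 1
```

For instance, a fabricated 18-character base such as "EXAMPLE0000ENTITY1" plus its computed check digits passes the test, while any single-character corruption fails it, which is exactly the property that lets systems reject mistyped identifiers before they pollute a counterparty master.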
