Regulations Drive Big Data—Part II
Capital markets are being bombarded with new regulations in 2012, escalating the already formidable challenges posed by Big Data.
The regulations will require firms to aggregate and report on transactions across every asset class, and to accurately and swiftly calculate risk exposures to counterparties on a firm-wide basis.
“In the coming months, 70 new capital markets regulations will come into effect in Europe and over 300 will come into effect in the U.S.,” said Chris Elsmore, senior vice-president at Panopticon Software, which provides data visualization software for financial companies.
“Firms must comply with the myriad provisions of Dodd-Frank, Basel III and SEC Form PF along with other regulatory requirements that are not yet on the books,” he said.
The thrust of the regulations, many of which were enacted in direct response to the financial meltdown of 2008-09, is to reduce systemic risk and provide greater transparency.
“Dodd-Frank ups the ante on information transparency in American financial services, while, later this year, insurers in the U.K. will need to comply with Retail Distribution Review and European firms will need to deal with Solvency II,” said Matt Benati, vice-president of global marketing at Attunity, a provider of software and data integration solutions.
“All of these regulations highlight the importance of data and the challenges associated with compliant access and availability,” said Benati. “The ability to furnish data in an accurate, complete and timely manner is a critical component for compliance.”
Companies now have the opportunity to identify the added business value they want to derive from the Dodd-Frank compliance exercise.
“The regulations mandate a more sophisticated approach to the collection of data from multiple systems and the ability to aggregate that data, report it out and then maintain it in a fully auditable way,” said Ian Jones, senior strategist for commodity risk at SAS RiskAdvisory. “This required level of data management opens up an opportunity for a more comprehensive, and more real-time, capacity to run all manner of analytics on this new, aggregated dataset.”
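Jones's point about aggregating data from multiple systems while keeping it fully auditable can be sketched in a few lines. This is a hypothetical illustration, not any regulatory schema: the field names (`system`, `trade_id`, `notional`) and the append-only audit log are assumptions for the sake of the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditedStore:
    """Aggregates trade records from many source systems and keeps an
    append-only audit log of every ingestion. Purely illustrative."""
    records: list = field(default_factory=list)
    audit_log: list = field(default_factory=list)

    def ingest(self, system: str, trades: list) -> None:
        # Tag each record with its source system, then log the action
        # with a UTC timestamp so the load itself is auditable.
        for t in trades:
            self.records.append({"system": system, **t})
        self.audit_log.append({
            "action": "ingest",
            "system": system,
            "count": len(trades),
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def total_notional(self) -> float:
        # A simple firm-wide aggregate across all source systems.
        return sum(r["notional"] for r in self.records)

store = AuditedStore()
store.ingest("rates_desk", [{"trade_id": "T1", "notional": 5_000_000.0}])
store.ingest("fx_desk", [{"trade_id": "T2", "notional": 2_500_000.0}])
print(store.total_notional())  # 7500000.0
print(len(store.audit_log))    # 2
```

Once the data from disparate systems sits in one tagged, logged dataset like this, the kind of real-time, cross-portfolio analytics Jones describes becomes a query rather than a project.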
The Commodity Futures Trading Commission has put off the critical definitional phase of Dodd-Frank rulemaking, which will determine the applicability of other rules such as position limits, protection of cleared swaps collateral and which swaps will be deemed “made available to trade”.
“So far the [CFTC] has proposed 55 rules, and finalized about one-third of those,” said Jones. “So it’s safe to say that we are squarely into an implementation mode. There are, however, some major issue-defining regulations that still need to be addressed and that may push some implementation toward the summer of 2012.”
This has particular relevance to the energy and commodity trading segments of the capital markets, presenting both a business challenge and a technology challenge.
“The first concern for the industry is determining which category they fall into under the regulations, which defines how heavy their compliance burdens will be, and then monitoring their transactions to ensure their status doesn’t change,” said Jones.
“Then there’s the system challenge—ensuring they have adequate data collection and retention techniques in place, plus very robust reporting capabilities to respond in a timely fashion to some of the fairly core changes the new rules require.”
Together, these challenges will require a realignment of critical data collection and reporting processes to avoid Dodd-Frank regulatory pitfalls.
“It’s important to take an inventory of all your company’s data—to know its attributes and understand the very specific fields,” added Jones.
The Federal Register now provides tables that companies can use as guides to the specific fields regulators require. In some cases, the requisite information may reside in another system or in a downstream process, and time-stamping will be a key issue.
Companies can prepare “messaging appliances” to handle the transfer of data from trading activities to swap data repositories. A messaging appliance helps companies meet the real-time or near-real-time reporting that is required.
“It’s either going to become the domain of the middle office or the compliance department or back office, but what companies need to take into account is that this is going to be a case of managing queues of messages that will necessarily need to be provided to the Swap Data Repositories in real-time,” Jones said.
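The queue-management pattern Jones describes can be sketched minimally: trade events are time-stamped as they are captured, held in an outbound queue, and drained to the repository. This is an assumption-laden sketch; `send_to_sdr` is a stub standing in for whatever gateway an actual Swap Data Repository exposes.

```python
import queue
from datetime import datetime, timezone

def send_to_sdr(message: dict) -> None:
    # Stand-in for the real SDR submission interface (hypothetical).
    print("reported:", message["trade_id"])

# Outbound queue of trade messages awaiting submission.
outbound: "queue.Queue[dict]" = queue.Queue()

def enqueue_trade(trade_id: str, payload: dict) -> None:
    # Time-stamp the event on capture so reporting latency is auditable.
    outbound.put({
        "trade_id": trade_id,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        **payload,
    })

def drain() -> int:
    # Drain all pending messages to the SDR; return how many were sent.
    sent = 0
    while not outbound.empty():
        send_to_sdr(outbound.get())
        sent += 1
    return sent

enqueue_trade("T1", {"asset_class": "commodity"})
enqueue_trade("T2", {"asset_class": "rates"})
print(drain())  # 2
```

Whether this queue lives in the middle office, compliance or the back office, the design point is the same: capture-time timestamps plus a persistent queue make it possible to demonstrate that each message reached the repository within the required reporting window.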
Currently, there is no single database of comprehensive and readily accessible data regarding orders and executions. Instead, each self-regulatory organization (SRO) uses its own separate audit trail systems to track information relating to orders in its respective markets.
In order to address this, the SEC has proposed a Consolidated Audit Trail (CAT). “CAT means the SEC will have the ability to see what occurred in the market, so it doesn’t take three months to analyze a Flash Crash,” said Dominic Iannacone, director of business development at enterprise software company Sybase.
The proposed rule would require every exchange and FINRA, an independent securities regulator, to provide detailed information to a newly created central repository regarding each quote and order in a national market system security, and each reportable event with respect to each quote and order.
FINRA operates the Order Audit Trail System (OATS), an integrated audit trail of order, quote and trade information for Nasdaq and OTC equity securities. FINRA uses this audit trail system to recreate events in the lifecycle of orders and more completely monitor the trading practices of member firms.
FINRA has proposed using OATS as the foundation for a CAT. However, OATS does not accommodate options data and does not support real-time trade data surveillance, Nasdaq OMX said in a comment letter.
Nasdaq OMX has proposed a CAT powered by a combination of Nasdaq’s FTEN and Smarts technology with the OATS data format.
This more advanced CAT system could be expanded to equities and options at a cost no higher than FINRA estimates for an equity-only CAT based solely on OATS, according to Nasdaq.