Bank of England Invests In Data Analysis
The regulator wants to integrate data from multiple repositories and remove duplicates.
The Bank of England is investing in a new data architecture to improve the analysis of information from multiple trade repositories.
The UK central bank said in its latest Financial Stability Report yesterday that it is investing in its capability and technology to collect, process and store data to enhance its ability to query and analyse information from trade repositories.
“The new data architecture will solve some of the existing technological impediments to analysing trade repository data, most notably by combining the data available from multiple trade repositories into one integrated data set and automating the identification of duplicate copies of reported transactions,” said the report. “By improving data collection, processing and storage, the Bank will be able to analyse larger volumes of data, across multiple trade repositories and across time, significantly faster.”
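The combination-and-deduplication step the report describes can be sketched in a few lines. This is an illustrative sketch only, not the Bank's actual architecture: the field names (`uti`, `repo`, `notional`) are hypothetical, standing in for the regulator-defined schemas that real trade repository records follow.

```python
# Hypothetical sketch: merge trade reports from multiple repositories
# and keep a single copy of each transaction, keyed on its identifier.

def deduplicate(reports):
    """Return one report per unique transaction identifier."""
    seen = {}
    for report in reports:
        uti = report["uti"]  # Unique Transaction Identifier (illustrative key)
        if uti not in seen:
            seen[uti] = report  # first copy wins; later duplicates are dropped
    return list(seen.values())

# The same transaction (T1) reported to two repositories:
repo_a = [{"uti": "T1", "repo": "A", "notional": 100}]
repo_b = [{"uti": "T1", "repo": "B", "notional": 100},
          {"uti": "T2", "repo": "B", "notional": 250}]

combined = deduplicate(repo_a + repo_b)
# Two unique transactions remain: T1 and T2
```

In practice, duplicate detection across repositories is harder than a single key lookup (identifiers may be missing or inconsistent), which is part of the data-quality problem the report raises later.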
After the financial crisis in 2008, the leaders of the G20 pushed for a shift of standardised derivatives from the over-the-counter market into central clearing in order to improve transparency. Standardised products would be traded on exchanges or electronic trading platforms and centrally cleared, with uncleared transactions subject to higher capital requirements and mandatory margin requirements. Details of each transaction would be reported to trade repositories.
The European Market Infrastructure Regulation, which covers central clearing, came into force in December 2014 and introduced the reporting requirements for derivatives in the region. There are now eight authorised trade repositories in the European Union.
The Bank of England said transaction-level data from repositories has increased the transparency of derivatives markets to authorities. However, regulators lack a global view of derivatives markets because many authorities can only access local data.
“The Bank will work with other authorities internationally to promote the faster progress required at international level to resolve barriers to data quality, standardisation and sharing,” said the report. “In particular, no international work is currently under way to decide on how a cross-border data aggregation mechanism should work in practice.”
The regulator continued that it has already used trade repository data to support its objectives in a number of ways. These include monitoring activity and positioning in derivatives markets around significant market events, such as in the run-up to and immediately following the United Kingdom’s referendum on EU membership last year; assessing the market impact of policy shocks, such as the implications of the Swiss franc’s depeg from the euro in 2015; and understanding the structure of key derivatives markets to inform policymaking and supervisory decision-making.
In addition the Bank’s upcoming in-depth assessment of the role of leverage in the non-bank financial system will draw heavily on repository data to analyse non-banks’ use of derivatives.
The Bank continued that the increase in central clearing has reduced the number and size of derivatives exposures generated by a given trading volume, and made the system more resilient under stress.
“There has been a marked increase in rates of central clearing in some of the largest asset classes underlying OTC derivatives,” said the report. “For example, the minimum percentage of outstanding single-currency OTC interest rate derivatives globally that are centrally cleared has increased from 24% at end-2008 to 62% at end-June 2017; for credit default swaps, it has increased from 5% at end-June 2010 to 34% at end-June 2017.”
Clarus Financial Technology, the derivatives analytics provider, said in a blog today that analysis of US swap data repositories suggests that uncleared markets see little new trading activity.
“Notional outstanding of uncleared trades has been constant for the past year,” added Clarus.
In addition, Clarus found that the gross market value of both cleared and uncleared positions has decreased by around 40% in the past year.
“These gross market values have moved hand-in-hand between both cleared and uncleared positions,” said Clarus. “Compression still appears to have a lot of work to do in uncleared markets to shrink legacy portfolios.”
Compression is a process in which clients can “tear up” offsetting trades in their portfolios to reduce the notional outstanding and the number of line items while maintaining the same risk exposure. Use of compression services has increased following the introduction of stricter capital requirements, which have led banks to reduce their balance sheets and made capital efficiency increasingly important.
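The mechanics of a tear-up can be illustrated with a toy example. This is a deliberately simplified sketch, not any compression vendor's algorithm: a buy and a sell of equal size on the same instrument cancel each other, cutting gross notional and line items while the net position, and hence the risk exposure, is unchanged.

```python
# Illustrative sketch of portfolio compression via pairwise tear-ups.
# Trade amounts are signed: positive = bought notional, negative = sold.

def compress(trades):
    """Net offsetting trades per instrument; drop fully offset positions."""
    net = {}
    for instrument, amount in trades:
        net[instrument] = net.get(instrument, 0) + amount
    return {k: v for k, v in net.items() if v != 0}

# Hypothetical portfolio: two offsetting 5-year swaps and one 10-year swap.
portfolio = [("IRS_5Y", 100), ("IRS_5Y", -100), ("IRS_10Y", 50)]

gross_notional = sum(abs(a) for _, a in portfolio)        # 250 before
compressed = compress(portfolio)                           # {"IRS_10Y": 50}
net_notional = sum(abs(v) for v in compressed.values())    # 50 after
```

Gross notional falls from 250 to 50 and three line items become one, yet the net exposure per instrument is identical before and after, which is why compression shrinks legacy portfolios without changing market risk.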