Decimalization and Data
While the Securities and Exchange Commission mulls a pilot program to widen the minimum quoting variation for shares of small and mid-sized companies, the role that data management plays in capital markets is being scrutinized.
Regulatory changes, including decimalization and Regulation NMS, have succeeded in strengthening competition by narrowing spreads, especially in large cap stocks.
However, the current market structure may be suboptimal for trading in shares of smaller companies.
“We support a focused pilot program that scientifically evaluates the impact of widening the minimum quote variations for small and mid-cap stocks,” said David Weisberger, executive principal of Two Sigma Securities, in an April 23 comment letter.
Various methods have been proposed for measuring the share of market capitalization that actually trades and the limited liquidity available in small cap stocks.
“The SEC should employ a data-driven approach, and determine the scope and course of the pilot program based only on whether there are measurable benefits to liquidity, efficiency, competition and capital formation,” said Weisberger.
On February 5, the SEC hosted a roundtable discussion regarding the impact that decimal-based pricing increments (i.e., decimalization), which replaced fraction-based tick sizes in 2001, have had on U.S. securities markets, investors, issuers and market professionals.
Panelists discussed the merits and design of a potential SEC pilot program intended to establish an empirical basis for action, if any, that the SEC may take with respect to tick sizes. While SEC staff members indicated that no final decision had been made to move forward with a pilot program, participants in the roundtable generally favored the concept of such a program.
“I look at this potential pilot program as a great data source and the ability to go forward and make some real judgments of how the market could look going forward,” said Paul Jiganti, managing director in charge of market structure at TD Ameritrade, at the roundtable.
The role played by data has increased in response to regulatory and technological changes.
Enterprise data providers are supplying the framework that capital markets firms need to participate in regulatory-driven initiatives such as the decimalization pilot, Large Trader Reporting and the Consolidated Audit Trail.
“We view ourselves as offering firms the capability to be compliant with their regulatory obligations,” said Stanley Young, CEO of Bloomberg Enterprise Products and Solutions.
Traditional financial data-delivery and management providers, including Bloomberg, have extracted raw data and information from exchanges, brokers and other sources and then delivered the information directly to traders.
However, Bloomberg has now become a full-service provider of enterprise market-data delivery and management, offering capabilities that feed critical data, analysis and insight directly into enterprise information technology infrastructures.
Specifically, the objective of this new focus is to provide organizations with a way to leverage Bloomberg’s global network, information delivery and distribution tools, as well as its services.
“We are looking at how we can apply Bloomberg’s assets, such as data analytics, to tackle the business challenges our clients face in this complex world,” said Young. “Bloomberg’s assets revolve around two things: data and workflow. In addition to aggregating data, we have enabled firms to distribute data, including a firm’s proprietary data, via our APIs and platforms.”
The Feb. 5 roundtable was part of the SEC’s response to Section 106(b) of the Jumpstart Our Business Startups Act, which requires the SEC to study and report on the impact that decimalization has had on the number of initial public offerings since its implementation, as well as the impact decimalization has had on liquidity for small- and mid-cap company securities.
The goal of improving small cap liquidity could be undermined by a pilot program that is implemented too broadly. In particular, care should be exercised against narrowing tick sizes for large cap stocks as that could significantly alter the market structure and undermine the experiment in small caps.
“A pilot program should be designed in a manner that avoids biases in the stock selection methodology for both the experimental and control groups,” Weisberger said. “We would suggest that in the experimental group, the methodology used for determining the optimal quote increment be objective and data-driven, rather than arbitrary.”