Decimalization and Data

04.29.2013
Terry Flanagan

While the Securities and Exchange Commission mulls a pilot program to widen the minimum quoting variation for shares of small and mid-sized companies, the role that data management plays in capital markets is being scrutinized.

Regulatory changes, including decimalization and Regulation NMS, have succeeded in strengthening competition by narrowing spreads, especially in large cap stocks.

However, the current market structure may be suboptimal for trading in shares of smaller companies.

“We support a focused pilot program that scientifically evaluates the impact of widening the minimum quote variations for small and mid-cap stocks,” said David Weisberger, executive principal of Two Sigma Securities, in an April 23 comment letter.
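As a rough illustration of what a wider increment means in practice, the sketch below (a hypothetical example, not drawn from any SEC or Two Sigma proposal; the $0.05 tick is an assumption) snaps a penny-wide quote onto a nickel grid, the kind of constraint a pilot stock could face:

```python
# Hypothetical illustration: snapping quotes to a wider minimum increment.
# The $0.05 tick is assumed for illustration only.
from decimal import Decimal, ROUND_DOWN, ROUND_UP

def snap_bid(price: Decimal, tick: Decimal) -> Decimal:
    """Round a bid down to the nearest permissible tick."""
    return (price / tick).to_integral_value(rounding=ROUND_DOWN) * tick

def snap_ask(price: Decimal, tick: Decimal) -> Decimal:
    """Round an ask up to the nearest permissible tick."""
    return (price / tick).to_integral_value(rounding=ROUND_UP) * tick

bid, ask = Decimal("10.02"), Decimal("10.03")   # penny-quoted market
wide_tick = Decimal("0.05")                     # widened increment
print(snap_bid(bid, wide_tick), snap_ask(ask, wide_tick))  # 10.00 10.05
```

Under such a regime, a spread that is one cent wide today could be no narrower than five cents, which is the trade-off the pilot is meant to measure against any gains in displayed liquidity.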

Various methods have been proposed for measuring the percentage of market capitalization traded and the limited liquidity available in small-cap stocks.
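One such measure is annualized turnover, the dollar volume traded expressed as a share of market capitalization. The sketch below uses placeholder figures rather than data from the comment letter:

```python
# A minimal sketch of one liquidity measure: annualized turnover, i.e.,
# dollar volume traded as a percentage of market capitalization.
# The inputs below are placeholders, not figures from the comment letter.

def turnover_pct(avg_daily_dollar_volume: float,
                 market_cap: float,
                 trading_days: int = 252) -> float:
    """Share of market capitalization traded over a year, in percent."""
    return 100.0 * avg_daily_dollar_volume * trading_days / market_cap

# Example: a $300M-cap stock trading $150K a day turns over roughly 12.6%
# of its capitalization in a year, far less than an actively traded name.
print(round(turnover_pct(150_000, 300_000_000), 1))
```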

“The SEC should employ a data-driven approach, and determine the scope and course of the pilot program based only on whether there are measurable benefits to liquidity, efficiency, competition and capital formation,” said Weisberger.

On February 5, the SEC hosted a roundtable discussion on the impact that decimal-based pricing increments (i.e., decimalization), which replaced fraction-based tick sizes in 2001, have had on U.S. securities markets, investors, issuers and market professionals.

Panelists discussed the merits and design of a potential SEC pilot program intended to establish an empirical basis for action, if any, that the SEC may take with respect to tick sizes. While SEC staff members indicated that no final decision had been made to move forward with a pilot program, participants in the roundtable generally favored the concept of such a program.

Paul Jiganti, managing director, market structure, TD Ameritrade

“I look at this potential pilot program as a great data source and the ability to go forward and make some real judgments of how the market could look going forward,” said Paul Jiganti, managing director in charge of market structure at TD Ameritrade, at the roundtable.

The role played by data has increased in response to regulatory and technological changes.

Providers of enterprise data are providing the framework for capital markets firms to participate in regulatory-driven initiatives such as the decimalization pilot, Large Trader Reporting, and Consolidated Audit Trail.

“We view ourselves as offering firms the capability to be compliant with their regulatory obligations,” said Stanley Young, CEO of Bloomberg Enterprise Products and Solutions.

Traditional financial data-delivery and management providers, including Bloomberg, have extracted raw data and information from exchanges, brokers and other sources and then delivered the information directly to traders.

However, Bloomberg has now become a full-service provider of enterprise market-data delivery and management, delivering capabilities that feed critical data, analysis and insight directly into enterprise information technology infrastructures.

Specifically, the objective of this new focus is to provide organizations with a way to leverage Bloomberg’s global network, information delivery and distribution tools, as well as its services.

“We are looking at how we can apply Bloomberg’s assets, such as data analytics, to tackle the business challenges our clients face in this complex world,” said Young. “Bloomberg’s assets revolve around two things: data and workflow. In addition to aggregating data, we enable firms to distribute data, including a firm’s proprietary data, via our APIs and platforms.”

The Feb. 5 roundtable was part of the SEC’s response to Section 106(b) of the Jumpstart Our Business Startups Act, which requires the SEC to study and report on the impact that decimalization has had on the number of initial public offerings since its implementation, as well as the impact decimalization has had on liquidity for small- and mid-cap company securities.

The goal of improving small cap liquidity could be undermined by a pilot program that is implemented too broadly. In particular, care should be exercised against narrowing tick sizes for large cap stocks as that could significantly alter the market structure and undermine the experiment in small caps.

“A pilot program should be designed in a manner that avoids biases in the stock selection methodology for both the experimental and control groups,” Weisberger said. “We would suggest that in the experimental group, the methodology used for determining the optimal quote increment be objective and data-driven, rather than arbitrary.”
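A minimal sketch of what an unbiased selection methodology could look like, assuming (purely as an illustration, not as the letter's actual design) stratification by market capitalization followed by random assignment within each stratum:

```python
# Hypothetical sketch: stratify candidate stocks by market cap, then
# randomize within each stratum so pilot and control groups are comparable.
# Tickers and capitalizations are made up for illustration.
import random

def assign_groups(stocks: dict, bins: int = 3, seed: int = 42):
    """Randomly split stocks into pilot/control within market-cap strata."""
    rng = random.Random(seed)
    ordered = sorted(stocks, key=stocks.get)       # smallest cap first
    stratum_size = max(1, len(ordered) // bins)
    pilot, control = [], []
    for i in range(0, len(ordered), stratum_size):
        stratum = ordered[i:i + stratum_size]
        rng.shuffle(stratum)
        half = len(stratum) // 2
        pilot.extend(stratum[:half])
        control.extend(stratum[half:])
    return pilot, control

caps = {"AAA": 2.5e8, "BBB": 4.0e8, "CCC": 7.5e8,
        "DDD": 1.2e9, "EEE": 1.8e9, "FFF": 2.4e9}  # hypothetical issuers
print(assign_groups(caps))
```

Randomizing within strata, rather than hand-picking symbols, is one way to keep the experimental and control groups statistically comparable, which is the bias concern the letter raises.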
