06.06.2019
By Terry Flanagan

Equity Market Structure Debated (Correct)

(Corrects attribution in 6th and 7th paragraphs.)

Market data, trading regulation and liquidity were among the topics discussed Thursday at the U.S. Market Structure Roundtable at the Sandler O’Neill Global Exchange and Brokerage Conference.

The data discussion covered whether and how the Securities Information Processor (SIP) is still useful, and the fairness of exchange data-feed pricing.

Enrico Cacciatore, Voya IM

Enrico Cacciatore, Head of Market Structure & Trading Analytics at Voya, expressed the common view that as a ‘top-of-book’ utility offering, SIP isn’t good enough for today’s institutional traders.  

“SIP is a proxy,” Cacciatore said. “But if I’m trading with algos, I need depth of book and much more breadth of information. The only way you’ll get that is from direct feeds.”

Joe Mecane, Head of Citadel Execution Services, said the data issue can be looked at from three angles: business, or what type of data a trader needs to trade; regulatory, or what is required and protected by regulation; and best execution.

Regulation National Market System, adopted in 2005, “decided to stick with the SIP, but they didn’t see what was coming,” in terms of market evolution, said Brett Redfearn, Director, Division of Trading & Markets at the U.S. Securities and Exchange Commission. “SIP has been relegated to eyeballs and backup systems…Smaller brokers are having a hard time competing” using SIP, and they may not have the resources to buy exchange data feeds.

Joe Mecane, Citadel

The industry needs to redefine SIP’s role in the market ecosystem. “What is SIP used for and what do we want it to be used for?” Mecane queried. “We have to answer those questions first before asking what it should look like.”

Exchanges and brokers have battled over data fees for years. Brokers say the fees are usurious and the exchanges behave monopolistically; exchanges say fee increases reflect their own higher technology costs and market participants can choose whether to buy or not.

Last month, the SEC released guidance urging exchanges to provide more justification for their fees.

Tom Wittman, Nasdaq

Tom Wittman, Head of Global Trading & Market Services at Nasdaq, said there is “tremendous competition on fees,” and the SEC guidance “feels a bit like overreaching because of the competition.”

Cacciatore said there needs to be a more expansive two-way dialogue, in which exchanges offer more transparency into their data fees and fee structures, and buy-side and sell-side market participants in turn provide more granularity into their data utilization. “Transparency is key to understand what is the need for data, how do we utilize it and what is a fair cost.”

Mecane suggested that rather than announcing fee increases effective immediately, exchanges instead put them out for public comment first, as regulators do with their rule changes. Wittman said exchanges typically put out fee changes on the last day of the month, effective on the first day of the next month, so that rivals don’t do the same thing. That reflects strong competition in data fees, Wittman said.

Numbers showing exchange data fees increasing by 1000% or 2000% over the years are misleading, at least in Nasdaq’s case, Wittman said. “When you look at our revenue, it’s because of growth in new subscribers and new initiatives, not us increasing the base rate,” he said.

Brett Redfearn, SEC

Redfearn said data topics the SEC is looking at include what is core data, whether SIP data is good enough for traders, and whether exchanges should break out margins of specific business lines.

Aside from the SIP, “where else do I get exchange data and at what point are competitive forces in play?” Redfearn said. “Those are questions for policy makers.”

In March, the SEC put its transaction fee pilot on hold amid court challenges from exchanges. First proposed early last year, the plan would subject stock exchange transaction-fee pricing, including “maker-taker” fee-and-rebate pricing models, to temporary pricing restrictions across three test groups.

Wittman called the pilot a “dangerous experiment” whose unintended consequences would include impaired liquidity for certain companies. Redfearn said the plan was for the program to result in improved, data-driven policy. “We are trying to do the most reasonable and rational way forward in this scenario,” Redfearn said.

There was some agreement among panelists on how to boost liquidity in thinly traded stocks, with a popular suggestion being to aggregate trading on one venue. “There are a boatload of stocks that do not trade very much — why should they trade on 13 exchanges in pennies and microseconds?” Redfearn said.
