SEC Addresses Market Technology After Knight Debacle
The Securities and Exchange Commission has thrown down the gauntlet on the issue of faulty software causing massive disruptions in the markets, as occurred in 2012 with Knight Capital’s trading snafu and the troubled Facebook and BATS IPOs.
Kicking off a roundtable discussion on market technology at the agency’s Washington headquarters this week, SEC Chairman Mary Schapiro said that “the stability of our securities markets is tied to the technological infrastructure of those markets.”
Following the May 2010 ‘Flash Crash’, regulators and self-regulatory organizations undertook measures to address weaknesses in the financial infrastructure, such as single-stock and market-wide circuit breakers, the elimination of stub quotes and mandatory pre-trade risk checks.
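A single-stock circuit breaker of the kind introduced after the Flash Crash can be sketched as a rolling price-move check. The pilot program paused trading on moves of roughly 10% within five minutes; the thresholds and class design below are illustrative, not the regulatory text.

```python
from collections import deque

class SingleStockCircuitBreaker:
    """Illustrative single-stock circuit breaker: signal a trading pause
    if the price moves more than `threshold_pct` within `window_secs`.
    Defaults loosely follow the 2010 pilot (10% over 5 minutes)."""

    def __init__(self, threshold_pct=10.0, window_secs=300):
        self.threshold_pct = threshold_pct
        self.window_secs = window_secs
        self.prices = deque()  # (timestamp, price) pairs in the window

    def on_trade(self, ts, price):
        # Record the trade and drop prices that fell out of the window.
        self.prices.append((ts, price))
        while self.prices and ts - self.prices[0][0] > self.window_secs:
            self.prices.popleft()
        ref = self.prices[0][1]  # oldest price still inside the window
        move_pct = abs(price - ref) / ref * 100.0
        return move_pct > self.threshold_pct  # True -> halt trading
```

In practice exchanges apply such checks against a reference price feed rather than raw trades, but the rolling-window comparison is the core of the mechanism.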
But as the events of this year have made clear, technology has the potential to outstrip existing regulatory safeguards.
“We have witnessed the evolution of equities from the use of pencil and paper to a sophisticated web of servers that power broker internalization, ATSs [alternative trading systems] and exchanges,” said Lou Pastina, head of operations at the New York Stock Exchange, at the roundtable.
The Federal Reserve Bank of Chicago issued a report last month on keeping markets safe in the era of high-speed trading, recommending a set of risk controls it said could have helped mitigate the losses in the Knight situation, in which the U.S. market maker was left close to bankruptcy after a glitch caused erratic trading activity and saddled the firm with billions of dollars of unwanted securities.
They include limits on the number of orders that can be sent to an exchange within a specified period of time, intraday position limits, profit-and-loss limits and a “kill switch” that could stop trading at one or more levels.
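The controls listed above can be combined into a single pre-trade gate: each order is checked against an order-rate cap, a position limit and a loss limit, and a kill switch blocks all further orders once any limit trips. The thresholds and interface below are hypothetical, not drawn from the Chicago Fed report.

```python
from collections import deque

class PreTradeRiskLimits:
    """Sketch of the recommended controls (hypothetical thresholds):
    order-rate cap, intraday position limit, P&L limit, kill switch."""

    def __init__(self, max_orders_per_sec=100, max_position=1_000_000,
                 max_loss=5_000_000):
        self.max_orders_per_sec = max_orders_per_sec
        self.max_position = max_position
        self.max_loss = max_loss
        self.killed = False
        self._order_times = deque()

    def check_order(self, ts, signed_qty, current_position, realized_pnl):
        """Return True if the order may be sent, False if it is blocked."""
        if self.killed:
            return False
        # Order-rate limit: count orders seen in the last second.
        self._order_times.append(ts)
        while self._order_times and ts - self._order_times[0] > 1.0:
            self._order_times.popleft()
        if (len(self._order_times) > self.max_orders_per_sec
                or abs(current_position + signed_qty) > self.max_position
                or realized_pnl < -self.max_loss):
            self.killed = True  # kill switch: no further orders accepted
            return False
        return True
```

The key design point, and what was missing in Knight's case, is that the switch latches: once tripped, the system stops sending orders until a human intervenes.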
Pastina advocated the establishment of industry-wide best practices as well as the adoption of new technology, such as a kill switch.
“Given the complexity and interconnectedness of trading technology, the industry should adopt standard best practices for trading technology, covering system development lifecycles, standards for coding, implementing metrics and standard test environments,” Pastina said.
The kill switch, said Pastina, could be designed at the exchange level to halt trading activity by a given broker if that activity exceeds a predefined threshold based on trading volume.
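An exchange-side version of the kill switch Pastina describes can be sketched as per-broker volume accounting: the venue tracks each broker's cumulative executed volume and halts the broker once a predefined threshold is breached. The broker identifiers and threshold here are illustrative assumptions.

```python
from collections import defaultdict

class ExchangeKillSwitch:
    """Illustrative exchange-level kill switch: halt any broker whose
    cumulative executed volume exceeds a predefined threshold."""

    def __init__(self, volume_threshold):
        self.volume_threshold = volume_threshold
        self.volume = defaultdict(int)  # broker_id -> shares executed
        self.halted = set()

    def record_execution(self, broker_id, shares):
        if broker_id in self.halted:
            return "halted"
        self.volume[broker_id] += shares
        if self.volume[broker_id] > self.volume_threshold:
            self.halted.add(broker_id)  # trip the switch for this broker
            return "halted"
        return "ok"
```

A production version would size thresholds per broker from historical activity rather than a single global number, but the principle is the same: the venue, not just the member firm, can cut off runaway order flow.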
An industry working group of self-regulatory organizations, broker-dealers and buy-side firms has advocated these and other measures in a September 28 comment letter sent to the SEC.
At the roundtable, Schapiro drew a distinction between market structure and market infrastructure, and noted that the two are intertwined.
“There are two basic concerns we need to focus on that are highly interrelated,” she said. “These are first, the structure of our markets, such as multiple execution venues, the presence of high frequency trading, dark pools and the like, and second, the infrastructure of our markets, as in the technology that undergirds trading activity.”
As the Knight Capital situation demonstrated, market disruptions can be traced as much to poor quality control as to high-frequency trading strategies run amok.
Knight had just installed trading software that was intended to send orders to the NYSE’s new Retail Liquidity Program, Schapiro noted.
As the market data that morning revealed, the software did not create patterns of rapid orders and cancels. Rather, the data showed a massive amount of orders resulting in executed trades that caused Knight Capital to accumulate significant and unwanted positions.
“This type of problem, as with the IPO mishaps, was the result of basic technology 101 issues,” Schapiro said.
Such events demonstrate that core infrastructure and technology issues can be problematic regardless of market structure.
However, Schapiro said, it is important to recognize how the overall structure of our markets can affect how our infrastructure is designed and implemented.
“For example, we have a very competitive market environment in which rapid innovation and speed-to-market may compete with diligent testing and validation of the technologies that support such innovation,” she said.
Regulators have long been aware of the vulnerability of financial infrastructure to external threats, such as cyber-terrorism, as well as threats from within.
In the U.S., financial market utilities that handle post-trade processes are subject to regulatory scrutiny under a rule adopted by the Financial Stability Oversight Council (FSOC).
The FSOC was created to fulfill statutory obligations under the Dodd-Frank Act to assess the threat, failure or disruption a financial market utility (FMU) may pose to the stability of the U.S. financial system.
Under Dodd-Frank, in determining whether an FMU is systemically important, the FSOC must consider the aggregate monetary value of transactions the utility processes, its aggregate exposure to its counterparties, and its relationships, interdependencies and other interactions with other financial market utilities.
Once an FMU has been tagged as systemically important, it will fall under the jurisdiction of the Federal Reserve, the SEC and the Commodity Futures Trading Commission, which are authorized to prescribe risk management standards related to the payment, clearing and settlement activities of systemically important FMUs.