Automation Glitch-Testing Needed

Terry Flanagan

The silver lining to the recent spate of technology-related trading halts and snafus could be the lessons they provide about over-reliance on systems, as well as stronger checks and balances.

System changes not rationalized across the entire IT enterprise often expose glitches, according to Jeff Rauscher, a solutions designer for Redwood Software in Dallas.

“A lot of times, failings are by omission,” when market participants and operators such as central clearing houses, brokerage firms, and banks “do not test regularly for what could go wrong, and have not validated their environments” with Enterprise Process Automation (EPA), Rauscher said.

The size, scale and sophistication of configurations make it exceedingly hard to manage and monitor EPA testing. “It’s a population of one in terms of who else has this specific configuration,” he noted, and analytics are needed to understand what could go bad and what needs to be done when overload or other problematic conditions occur.

Big market players have spent millions of dollars to have high-availability networks, but “it is not a matter of how much is spent or how much technology is in place, it’s the approach to it that matters,” Rauscher said.

Redwood uses the Carnegie Mellon Capability Maturity Model to guide process improvement, helping clients assess where they want to be on the maturity continuum and manage their processes systematically.

“Learning what a customer’s next steps are is to learn their culture,” Rauscher told Markets Media. For example, an unorganized client might respond quickly but not consistently. Redwood develops repeatable and accurate responses to market changes, business changes, or technical issues, and orchestrates processes for better results, he said.

In recent months, Nasdaq OMX and Goldman Sachs have had system outages, but all market operators and trade handlers have occasional technology issues.

With such outages, “it boils down to they missed something in their approach and methodology — this is where the capability model gives a structure for fully optimized process,” noted Rauscher.

“When more transactions are coming across your connections than you ever anticipated, has the testing been done to see what threshold would cause the system to fail?” asked Rauscher.
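Rauscher’s question can be made concrete with a stress-ramp test: increase simulated load step by step until the system first fails, and record the last rate it survived. The sketch below is illustrative only; `handle_order` is a hypothetical stand-in for a real order-processing path, with an assumed fixed capacity standing in for whatever real resource limit the system has.

```python
def handle_order(order_id: int) -> bool:
    """Hypothetical stand-in for a real order-processing path.
    Simulates a system that degrades once a fixed capacity is exceeded."""
    CAPACITY = 5000  # assumed orders/sec the system can absorb
    return order_id < CAPACITY

def find_failure_threshold(step: int = 500, limit: int = 20000) -> int:
    """Ramp simulated load upward in increments of `step` until the
    first failure, returning the last rate the system sustained."""
    rate = step
    while rate <= limit:
        # Fire `rate` orders and check that every one is handled.
        ok = all(handle_order(i) for i in range(rate))
        if not ok:
            return rate - step  # last sustainable rate before failure
        rate += step
    return limit  # never failed within the tested range

print(find_failure_threshold())  # prints 5000 for the simulated capacity above
```

In practice the same ramp-and-observe pattern is applied against a test environment with real order flow replayed at increasing rates, so the failure threshold is known before production traffic finds it.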

In terms of comprehensive EPA testing across the entire U.S. securities industry, some blame the Securities and Exchange Commission for a chaotic market structure in which more than a dozen stock and options exchanges compete yet remain interconnected, producing burgeoning data feeds. CME Group, by contrast, has largely averted outages amid a less fragmented futures market.

SEC Chairwoman Mary Jo White convened leaders of the U.S. stock exchanges last week to identify several concrete measures for improving the robustness and resilience of individual and interdependent market systems. Nasdaq’s August trading interruption “should reinforce our collective commitment to addressing technological vulnerabilities of exchanges and other market participants,” she stated.

One of the primary challenges for any vendor offering solutions is that operators don’t like making changes, Rauscher noted, but “we are all familiar with Murphy’s Law, and should not assume Murphy is not going to hit us again.”

“Spend time learning from failures,” he continued. “What is the worst-case scenario that can take us out? If we don’t push the threshold to test these limits, we’re behind, (and) it’s too late to go back in and turn the wheel to improve the process.”
