Addressing the Complex Needs of Modern Electronic Trading (by Aaron Wald, Vela Trading Technologies)
In today’s electronic trading environment, market participants, both buyside and sellside, are facing challenges on a number of fronts. On one front is an ever-changing and increasingly onerous regulatory landscape. Proposed new rules, such as RegAT in the US and sections of MiFID II in Europe, will place a much greater burden on firms to ensure that their trading systems cannot disrupt the market. Market participants engaged in any kind of automated, algorithmic, or electronic trading will need to consider how these new regulations will affect them and what actions to take in order to stay compliant. On another front is the increased complexity of both the markets and the technology that underpins them.
Markets are becoming more and more interconnected, with the number of instruments and asset classes traded electronically across multiple venues continuing to grow. In this environment, firms face the combined challenge of how to process massive amounts of market data, how to connect and route their orders to the appropriate venues, and how to make best use of software, hardware, and network connectivity to do it all in a timely manner. On top of the regulatory and market complexity challenges, firms need to consider how to stay competitive and profitable in an environment where margins are becoming thinner and thinner, particularly in the low-latency trading space. Gone are the days when small reductions in latency could lead to massive increases in revenues. In today’s electronic markets – where low latency is a prerequisite to doing business – firms have to look beyond pure speed in order to gain a competitive edge. So what steps can firms take to address these challenges?
With much of the new regulation revolving around greater transparency, the onus is on banks and trading firms to provide regulators with deeper levels of data, not only around their trading activities, but also around their trading systems. Today, it is not enough for firms to be fast. In order to stay compliant, they also need to be transparent. This means that trading systems should be self-reporting, providing accurate and meaningful statistics in areas such as latency monitoring. If a firm is asked by the regulator to report on its true latencies, it should be able to identify not only where latency exists – whether in the software, the hardware, or the network – but also under what conditions that latency changes.
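As a minimal sketch of what such self-reporting might look like, the class below records per-message latencies and produces percentile statistics of the kind a firm could surface on request. All names and the microsecond timestamp convention are illustrative assumptions, not from any particular trading system.

```python
# Hypothetical sketch: self-reporting latency statistics for a trading
# component, assuming timestamps are captured at wire-in and wire-out
# in microseconds. Illustrative only.
import statistics


class LatencyMonitor:
    """Records per-message latencies (microseconds) and reports
    percentile statistics suitable for compliance reporting."""

    def __init__(self):
        self.samples = []

    def record(self, t_in_us, t_out_us):
        # One sample per message: time spent inside the system.
        self.samples.append(t_out_us - t_in_us)

    def report(self):
        s = sorted(self.samples)

        def pct(p):
            # Nearest-rank percentile over the sorted samples.
            return s[min(len(s) - 1, int(p / 100 * len(s)))]

        return {
            "count": len(s),
            "median_us": statistics.median(s),
            "p99_us": pct(99),
            "max_us": s[-1],
        }


monitor = LatencyMonitor()
for t_in, t_out in [(0, 12), (10, 25), (20, 29), (30, 120)]:
    monitor.record(t_in, t_out)
print(monitor.report())
```

In practice the interesting part is correlating these statistics with conditions – bursty market data, specific venues, software releases – so that the firm can explain not just where latency sits but when and why it changes.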
Another key area that the regulators are focusing on is the risk of market disruption. Upcoming regulations such as RegAT and MiFID II will go well beyond the pre-trade risk controls introduced a few years ago with SEC rule 15c3-5 in the US and the equivalent ESMA guidelines in Europe. Under the proposed legislation, many firms will now need to undertake substantial reviews of their systems and processes to ensure – and to prove to the regulator – that nothing they do will disrupt the market. This challenge can be addressed by putting in place best practices around software development and ensuring the right levels of regression testing and release certification are followed. For example, with every exchange-driven change (EDC) that triggers a software release, clients of trading system vendors will want those vendors to explain exactly what is changing in that release and to prove that their regression testing procedures are sufficiently robust not to introduce any bug – either around the EDC itself or elsewhere in the system – that would have a negative impact on the market.
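One common form such regression testing takes is a golden-file check run as part of release certification: outputs captured from the previously certified release must be reproduced exactly by the candidate release. The sketch below is illustrative only; the `normalise` function, the pipe-delimited message format, and the golden cases are all assumptions, not any vendor’s actual test suite.

```python
# Hypothetical sketch: golden-file regression check for release
# certification. The message format and normaliser are illustrative.
def normalise(raw):
    """Parse a pipe-delimited exchange message into a normalised dict."""
    symbol, price, size = raw.split("|")
    return {"symbol": symbol, "price": float(price), "size": int(size)}


# "Golden" expected outputs captured from the previous certified release.
GOLDEN = {
    "ABC|101.25|300": {"symbol": "ABC", "price": 101.25, "size": 300},
    "XYZ|9.90|1500": {"symbol": "XYZ", "price": 9.9, "size": 1500},
}


def run_regression():
    """Return the raw messages whose output no longer matches golden."""
    return [raw for raw, want in GOLDEN.items() if normalise(raw) != want]


assert run_regression() == []  # every certified case must still pass
```

The value of the approach is that an EDC-driven change must demonstrably leave every untouched code path producing byte-identical results, which is exactly the evidence clients and regulators are asking vendors for.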
One unintended consequence of all these new regulations is that markets are becoming more and more complex, and market participants – already under pressure to be fast – now have to process and act upon ever-increasing amounts of data in ever shorter time periods. This means that firms are continually looking for new and innovative technology solutions to help them navigate this complexity. The problem, however, is that as the markets become more complex, much of the software and hardware surrounding them becomes more complex too, arguably leaving the markets less safe – which is certainly not what the regulator intended. So in order to help, rather than hinder, firms in dealing with complexity, how can the various components of a trading solution best work together in an efficient, reliable, and cost-effective way? The answer lies in ensuring that the software and hardware underpinning the trading systems work in harmony – what the renowned high-performance computing and trading system expert Martin Thompson calls ‘Mechanical Sympathy’. Hardware continues to evolve: CPUs make ever greater use of multiple cores and complex memory architectures, and Intel’s latest Skylake chips, for example, are even due to ship with an integrated FPGA (Field Programmable Gate Array). Software running on these machines needs to understand what the underlying components are doing in order to take advantage of these characteristics. In other words, to address complexity in market infrastructure, trading software needs to be intelligently engineered to work in harmony with the underlying hardware, so that the application gets the most out of the machine in terms of latency, throughput, and access.
Rolling out EDCs on such heterogeneous chips is a much more viable proposition than programming, regression testing, and certifying those changes on pure FPGA-based systems, for example.
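A small, concrete illustration of mechanical sympathy is data layout: storing hot fields in contiguous arrays (struct-of-arrays) rather than as scattered heap objects, so the CPU’s caches and prefetchers can stream through them. The sketch below uses Python’s `array` module to make the contrast; the quote data and function names are illustrative assumptions.

```python
# Illustrative sketch of 'mechanical sympathy' via data layout.
from array import array

# Array-of-objects: each quote is a separate heap object; iterating
# over them chases pointers scattered across memory.
quotes_as_objects = [{"bid": 100.0 + i, "ask": 100.5 + i} for i in range(4)]

# Struct-of-arrays: bids and asks each live in one contiguous C buffer
# of doubles, which is far friendlier to CPU caches and prefetchers.
bids = array("d", (100.0 + i for i in range(4)))
asks = array("d", (100.5 + i for i in range(4)))


def mid_prices(bids, asks):
    # A tight loop over contiguous doubles; in C/C++ the same access
    # pattern also vectorises well on modern cores.
    return [(b + a) / 2.0 for b, a in zip(bids, asks)]


print(mid_prices(bids, asks))  # [100.25, 101.25, 102.25, 103.25]
```

Python itself adds interpreter overhead that masks most of the benefit; the point is the access pattern, which in a compiled trading system translates directly into fewer cache misses per message.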
The firms able to handle this complexity are the ones who will gain a significant advantage going forward, particularly in a multi-asset trading environment where it is necessary to process large data sets and connect to multiple markets. In the equities and exchange-traded derivatives space, where electronic trading is now relatively mature, margins today are thin. A few years ago, high frequency trading (HFT) firms could gain significant advantages by shaving a few fractions of a second off their order flow, but those opportunities are now few and far between, so firms have to look beyond pure speed to stay competitive. This means they are increasingly adopting multi-asset trading strategies, in instruments such as fixed income and FX, for example.
However, this brings new challenges in terms of sourcing and processing the requisite market data for these instruments in an integrated, consolidated, and normalised way. Buyside firms able to process this data efficiently, and the sellside firms able to support those activities at scale, will steal a march on their competitors and continue to do well. Looking ahead, it is likely that both buyside and sellside firms will make increasing use of the efficiency of the Cloud for aspects of their electronic trading ecosystems that are not latency sensitive. We are already seeing this with the use of streaming market data from RESTful APIs, for example, to feed lower-tier, non-trading applications such as dynamic risk spreadsheets. In this regard, much of the messaging and middleware technology implemented across banks within the last couple of decades could easily migrate to the Cloud.
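As a sketch of that lower-tier pattern, the snippet below polls a RESTful snapshot endpoint and joins the prices against positions for a risk view. The URL, the JSON shape, and both function names are hypothetical assumptions for illustration; no specific vendor API is implied.

```python
# Hypothetical sketch: feeding a non-latency-sensitive application
# (e.g. a risk spreadsheet) from a RESTful market-data endpoint.
# The endpoint URL and JSON shape are assumptions, not a real API.
import json
from urllib.request import urlopen

SNAPSHOT_URL = "https://md.example.com/v1/snapshot?symbols=ABC,XYZ"


def fetch_snapshot(url=SNAPSHOT_URL, opener=urlopen):
    """Pull one JSON snapshot; callers poll on a relaxed timer."""
    with opener(url) as resp:
        return json.loads(resp.read())


def risk_rows(snapshot, positions):
    """Join snapshot prices against positions for a simple risk view."""
    return [
        {"symbol": s, "qty": q, "value": q * snapshot[s]["last"]}
        for s, q in positions.items()
        if s in snapshot
    ]
```

Because the consumer tolerates seconds of staleness, a plain request/response API over the public internet is sufficient – precisely the class of workload that migrates naturally to the Cloud.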
With Cloud-based infrastructures designed to offer both simplicity and scale, and as people come to realise that many of their security fears around the Cloud are unfounded, firms will increasingly use such Cloud-based solutions, whether in a Public Cloud, Private Cloud, or Hybrid Cloud configuration.
Connecting it all together
With the electronic trading industry facing a number of pressures – increased regulation, increased complexity, and shrinking margins – staying ahead of the competition will certainly present a challenge for both buyside and sellside firms in the coming months and years. Consolidating their vendor relationships can help firms address these challenges, particularly where they are looking to rationalise costs across asset classes and move from a heavily siloed to a more integrated electronic trading environment.
Firms can significantly reduce both complexity and costs by reducing the number of vendors they use for areas such as market data, enterprise distribution, and market access gateways. Vendors who understand high-performance trading and offer the latest market data technology, and who can provide the breadth of coverage and depth of expertise demanded by today’s multi-asset, multi-region electronic trading marketplace, will enable banks, brokers, and trading firms to stay focused on their business by bringing both efficiency and simplicity to their trading infrastructure.