Buzz Around Big Data
Big Data is capturing the attention of traders as 2014 is about to get underway.
“In the last few years we have seen Big Data generate a lot of buzz along with the launch of several successful big data products,” said Gagan Mehra, chief evangelist of Software AG, in a blog posting. “The big data ecosystem has now reached a tipping point where the basic infrastructural capabilities for supporting big data challenges and opportunities are easily available. Now we are entering what I would call the next generation of big data — big data 2.0 — where the focus is on three key areas: speed, data quality and applications.”
Software AG has been putting a lot of effort not just into Big Data, but into fast Big Data. “When people talk about Big Data, they often talk about Hadoop, but that’s not real-time; that’s dividing problems into chunks,” said Dr. John Bates, chief technology officer of Software AG. “It’s a great technology, but it’s batch-oriented.”
Capital markets were the original Big Data domain: market data, trade data, quotes, news, and other feeds consumed by surveillance, risk, and trading algorithms. “It’s Big Data in motion. It’s in-memory technologies that can handle the messaging rates, storage, continuous and historical analysis, and being able to visualize those,” Bates said. “It’s all about in-memory technology, but real-time Big Data, with the emphasis on not just big but fast.”
Data is growing at an exponential rate, and the ability to analyze it quickly is more important than ever. Almost every big data vendor is releasing product offerings, such as in-memory processing, to analyze data faster.
The speed at which decisions are made has already reached a point where the human brain can’t keep up, Mehra said. This means that, based on defined rules, data is cleansed, processed, and acted on without any human intervention. In such environments, a single stream of bad data can act like a virus and lead to incorrect decisions or heavy financial losses.
“A good example is the world of algorithmic trading, where trades are placed every few milliseconds by analyzing stock market trends using algorithms rather than a human,” said Mehra.
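The rule-based, human-free pipeline Mehra describes can be sketched as a simple data-quality gate that quarantines suspect ticks before any automated decision acts on them. This is a minimal illustration, not any vendor's implementation; the `Tick` structure, `validate_tick` function, and the 10% jump threshold are all assumptions chosen for the example.

```python
# Minimal sketch of a rule-based data-quality gate in an automated
# decision pipeline. All names and thresholds here are illustrative.

from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float
    prev_price: float  # last accepted price for the same symbol

def validate_tick(tick: Tick, max_jump: float = 0.10) -> bool:
    """Apply defined rules before any automated decision acts on the data."""
    if tick.price <= 0:
        return False  # impossible price: quarantine, don't trade on it
    # Reject prices that jump more than max_jump (10%) from the last
    # accepted value -- one bad tick could otherwise trigger bad trades.
    if tick.prev_price > 0 and abs(tick.price - tick.prev_price) / tick.prev_price > max_jump:
        return False
    return True

print(validate_tick(Tick("ABC", 101.0, 100.0)))  # True: plausible move
print(validate_tick(Tick("ABC", 150.0, 100.0)))  # False: 50% jump, likely bad data
```

The key design point is that the rules run inline, at message rate, so the “virus” of a bad value is stopped before it propagates into a trading decision.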
Machine learning is another technique that is being used to improve data quality. It has made it easier to conduct pattern analysis to identify new data quality issues. Machine learning systems can be deployed in a closed loop environment where the data quality rules are refined as new quality issues are identified via pattern analysis and other techniques, Mehra said.
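The closed-loop arrangement Mehra describes, where pattern analysis flags new quality issues and the rules are refined in response, can be sketched as follows. This uses a basic z-score check as the “pattern analysis” stand-in; a production system would use a trained model, and the `QualityLoop` class and its threshold-tightening policy are assumptions for illustration only.

```python
# Hedged sketch of a closed-loop data-quality process: a simple
# statistical pattern check (z-score) flags suspect values, and each
# flagged issue refines the rule set for future data.

import statistics

class QualityLoop:
    def __init__(self, threshold: float = 3.0):
        self.threshold = threshold        # current data-quality rule
        self.history: list[float] = []    # accepted values
        self.flagged: list[float] = []    # identified quality issues

    def check(self, value: float) -> bool:
        """Return True if the value passes; otherwise flag it and
        refine the rules (here: tighten the threshold slightly)."""
        if len(self.history) >= 5:
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                self.flagged.append(value)
                # Close the loop: become slightly stricter after each
                # confirmed issue (illustrative refinement policy).
                self.threshold = max(2.0, self.threshold * 0.95)
                return False
        self.history.append(value)
        return True

loop = QualityLoop()
for v in [100, 101, 99, 100, 102, 101, 500]:  # 500 is a quality issue
    loop.check(v)
print(loop.flagged)  # [500]
```

The structural point is the feedback edge: each detected issue changes the rules applied to subsequent data, which is what distinguishes a closed loop from a static validation pass.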
Another area that’s very exciting and will see a big expansion in 2014 is the challenge that traders face in detecting new patterns on which algorithms can act. “It’s great if you have algorithms that can act on patterns, but how do you detect new patterns?” said Bates. “How do you know what to trade in this ever-complex marketplace?”
Some of Software AG’s customers are working on self-evolving algorithms, applying genetic approaches. “I’ve been talking to customers that have systems that can intelligently find new patterns and recommend them and start to mine them,” Bates said. “This might involve thousands of different parallel algorithms, each with slightly different parameters, finding the ones that are most profitable, and swapping them into the market. Self-evolving, more intelligent algorithms is something we are going to see more of, not just in 2014 but beyond.”
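The genetic approach Bates outlines, running many parameter variants in parallel, scoring their profitability, and swapping the best performers in, can be sketched like this. The moving-average strategy, the toy P&L scoring, and the mutation scheme are all assumptions made for the example; they are not Software AG's or any customer's actual algorithms.

```python
# Hedged sketch of a "self-evolving algorithm": many parameter variants
# of a simple strategy are scored in parallel, the most profitable
# survive, and survivors are mutated to form the next generation.

import random

def score(params: tuple[int, int], prices: list[float]) -> float:
    """Toy P&L: hold a position when the short moving average is
    above the long one. Illustrative scoring only."""
    short, long_ = params
    pnl = 0.0
    for i in range(long_, len(prices)):
        s = sum(prices[i - short:i]) / short
        l = sum(prices[i - long_:i]) / long_
        if s > l:
            pnl += prices[i] - prices[i - 1]
    return pnl

def evolve(prices, generations=10, pop_size=20, seed=42):
    rng = random.Random(seed)
    # Initial population: random (short, long) moving-average windows.
    pop = [(rng.randint(2, 5), rng.randint(6, 20)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: score(p, prices), reverse=True)
        survivors = pop[: pop_size // 2]          # keep the profitable half
        children = [(max(2, s + rng.randint(-1, 1)),   # mutate survivors
                     max(6, l + rng.randint(-2, 2)))
                    for s, l in survivors]
        pop = survivors + children
    return max(pop, key=lambda p: score(p, prices))

prices = [100 + i * 0.5 + random.Random(i).uniform(-1, 1) for i in range(60)]
best = evolve(prices)
print(best)  # best-performing (short, long) parameter pair found
```

In a real deployment the fitness function would be live or simulated market P&L, and the “swap into the market” step would replace running algorithms with the evolved winners, but the select-mutate-reevaluate loop is the same.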