Partnership Aims to Offer Speedy Big Data Management
Financial applications that depend on moving data across large distances are being coupled with specialized messaging technology to improve performance.
Tervela, a ‘data fabric’ provider, is working with Teradata, an analytic data solutions company, to rapidly acquire large volumes of data from distributed producers, then transport and load it into a Teradata data warehouse or data appliance for business analysis.
“There is a move away from sequential, batch and overnight processing to parallel, continuous and real-time processing,” said Barry Thompson, chief technology officer at Tervela.
The objective is to acquire and distribute data in real time, accelerate loading and time to query, and disseminate results globally as they arrive.
“We are seeing demand for high-performance data movement in capital markets and other verticals,” said Ben Gillis, area vice-president at Teradata. “Our customers are demanding faster time-to-query, especially for big data applications, to support an explosion of analytical systems. Tervela’s integration with Teradata will help our customers meet the needs of their consumers.”
While traditional databases driven by local or even distributed applications serve as good repositories, a high-performance connective tissue is needed to collect data concurrently from disparate sources and feed it into a centrally located “super analytics engine.”
“We are seeing this requirement across all of our customers, especially those belonging to the financial services and intelligence industries, who are essentially trying to solve the same problem,” said Thompson at Tervela.
While ‘big data’ analytics platforms such as Hadoop, which aim to analyze data sets that have grown too large to handle conveniently, are adequate for data storage, organization and general retrieval, “they only meet high-performance requirements if they have local access to the data, and that’s not always the case,” Thompson said.
Geographically dispersed high-speed trading and regulatory compliance are driving the adoption of data fabric technology.
The data fabric provides intelligent routing services to make sure that data gets where it needs to be quickly and efficiently.
“We don’t manufacture analytics platforms, but we do make them faster and higher-performance,” Thompson said.
A data fabric is a logical data layer that overlays a company’s existing networks and data sources including databases or cloud services. It eliminates the need for application developers to write their own layers of functionality.
Customers overlay Tervela’s hardware and software appliances on top of existing feeds, databases and cloud services to build high-performing distributed applications.
“A data fabric provides high-performing, no-loss access to data,” Thompson said. “Most, if not all, applications in financial markets require this functionality in order to distribute data efficiently and securely across the globe.”
“The fabric knows who is producing it and who needs to consume it, and can adjust internal networking routes in real-time to make sure that data is put on just those routes that lead to authorized consumers,” said Thompson.
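The routing behavior Thompson describes can be pictured as topic-based publish/subscribe with an authorization check. The sketch below is a hypothetical in-memory illustration only, not Tervela's actual appliance or API: messages published to a topic are delivered solely to consumers that are both subscribed and authorized for it.

```python
from collections import defaultdict

class DataFabric:
    """Minimal sketch of authorized topic-based routing (illustrative only)."""

    def __init__(self):
        self._subscribers = defaultdict(set)   # topic -> subscriber names
        self._authorized = defaultdict(set)    # topic -> names allowed to consume
        self._delivered = defaultdict(list)    # subscriber -> received messages

    def authorize(self, topic, consumer):
        self._authorized[topic].add(consumer)

    def subscribe(self, topic, consumer):
        self._subscribers[topic].add(consumer)

    def publish(self, topic, message):
        # Route only to subscribers that are also authorized for the topic,
        # mirroring the fabric's "just those routes that lead to authorized
        # consumers" behavior.
        for consumer in self._subscribers[topic] & self._authorized[topic]:
            self._delivered[consumer].append((topic, message))

    def inbox(self, consumer):
        return self._delivered[consumer]

fabric = DataFabric()
fabric.authorize("trades.us", "risk_desk")
fabric.subscribe("trades.us", "risk_desk")
fabric.subscribe("trades.us", "other_app")   # subscribed but not authorized
fabric.publish("trades.us", {"symbol": "XYZ", "qty": 100})
```

Here only `risk_desk` receives the trade message; `other_app`, though subscribed, is never placed on the delivery route. A production fabric would make these routing decisions in network hardware rather than application code.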