Next-Generation Data Centers
The future of the data center in financial markets will revolve as much around connectivity as around housing data and applications, with co-location, low latency and financial ecosystems set to become mantras throughout the first quarter of the 21st century.
“Next-generation data centers are about being able to connect rapidly to everyone else that is in the ecosystem,” said Justin Llewellyn-Jones, chief operating officer of trading technology firm Fidessa’s U.S. operations.
“Old school data centers were about warehousing, with no awareness of who else was in that data center,” he said. “The next generation of data centers are trying to foster the concept of internalization. They are attempting to make cross-connects and form a type of marketplace for industry segments, thus evolving from simply an infrastructure provider to a more business-to-business environment.”
A critical mass of exchanges, alternative trading systems, sell-side and buy-side firms, market data vendors, clearing houses, as well as technology and connectivity vendors are coalescing within data centers to form ‘financial ecosystems’.
The trend holds implications for all capital markets sectors.
“The last generation of data centers focused on saving money per square foot to deliver continued reductions in operating costs,” said Mark Akass, chief technology officer at BT Global Banking & Financial Services, a communications provider. “The current generation of data centers is really quite good. They have massive amounts of bandwidth going in and out and within.”
Power consumption will also come down. “Kit runs hotter, reducing the need for cooling which helps save power and reduce carbon footprints,” Akass said. “In terms of next-generation data centers, their resilience model will leverage communications infrastructure better, using high-speed, low-latency communications to link physically separate locations to give virtual resilience.”
Next-generation data centers are high-density with significant capacity to add power and cooling for years to come.
“These data centers have the capability to run far more equipment and storage in less space than before,” said Craig Mohan, managing director of co-location, data center services, at exchange operator CME Group.
“High-density data centers also enable firms to analyze significantly more data in real-time than previously possible,” he said. “These high-density data centers will evolve beyond space and power to provide shared infrastructure and purpose-built trading infrastructure and data services to reduce customer cost.”
Data centers are the central points for the infrastructure to process market data and are strategically positioned near fiber backbones. They also can be proximate to the raw feed sources themselves, which reduces points of failure, lowers cost and improves quality.
In conjunction with field-programmable gate array (FPGA) and other forms of hardware acceleration, open source data storage systems such as Hadoop and Cassandra can process, store and trigger actions based on a high-volume real-time event stream, perform analytics on historical data and update models directly into the application.
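The pattern described above, acting on a high-volume real-time event stream while keeping a running analytic, can be sketched in a few lines. This is an illustrative example only, not tied to Hadoop, Cassandra or any FPGA toolchain; the class name, window size and VWAP analytic are assumptions chosen for clarity.

```python
from collections import deque

class StreamAnalytic:
    """Minimal sketch: maintain a sliding window over a trade event stream
    and recompute a rolling analytic (here, volume-weighted average price)
    as each event arrives."""

    def __init__(self, window=3):
        # deque with maxlen discards the oldest event automatically
        self.window = deque(maxlen=window)

    def on_event(self, price, volume):
        """Ingest one (price, volume) trade event; return the rolling VWAP."""
        self.window.append((price, volume))
        total_volume = sum(v for _, v in self.window)
        return sum(p * v for p, v in self.window) / total_volume

# Feed a short stream of hypothetical trade events through the analytic.
analytic = StreamAnalytic(window=3)
for price, volume in [(100.0, 10), (101.0, 20), (102.0, 30)]:
    vwap = analytic.on_event(price, volume)
print(round(vwap, 2))
```

In a production system the window update and analytic would typically run in hardware or a stream-processing engine, but the control flow — ingest, update state, trigger on the result — is the same.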
“Next-gen data centers will easily handle purpose-built, specialized hardware to process and analyze the ever expanding amount of data being created,” Mohan said.
Data volumes associated with capital markets have grown explosively in recent years, as traditional sources such as stock data are joined by exponentially higher volumes of new forms of structured and unstructured data.
Much of this data must be centralized and stored in ways that make it instantly available, which has made the data center a mission-critical component of capital markets.
“Next generation hardware in next-generation data centers will dramatically increase the sheer volume of computation and communication a data center can accomplish while reducing the energy requirement for a given unit of work,” said Michael Blundin, general manager of managed services at Vestmark, a provider of investment management software and services.
“This means broader, deeper analysis of huge volumes of data in real-time or near real-time and decreasing costs for services equivalent to today’s.”
Next-generation data centers will thus become ‘greener’ by optimizing the delivery of power and cooling to the systems.
“They support scale-in architectures that enable greater concentration of compute and data in a smaller space,” said Philip Enness, director of markets infrastructure at technology firm IBM.
Next-generation data centers will support hybrid architectures, enabling different technologies to be optimized to support specific types of workloads.
“They will house integrated systems that drive down support and integration costs,” Enness said. “And they are more agile and flexible in supporting applications and business initiatives that enable firms to get closer to their clients, optimize the trade process and re-engineer them for profitable growth.”
Next-generation data centers will also continue to utilize virtualization.
“Server virtualization is already mainstream within today’s data centers,” said Kevin Ressler, director of global product management, enterprise networks at TE Connectivity, a manufacturer of networking components. “Next-generation data centers will enable servers, network devices and storage devices to be managed as pools of resources performing at higher utilization than ever before, using more energy-efficient electronics.”
Data centers “will be better at enabling private cloud combinations of physically separate enterprise-owned and independent data centers to function as a cohesive whole”, Ressler said. “Next-generation data centers will also enable firms to turn time, distance and geography into opportunities and trade spaces instead of limitations.”
Next-generation data centers “will be friendlier to both higher-density hardware solutions as well as easily reachable virtual solutions”, said Brad O’Brien, vice president of development at CFN Services, a provider of managed automated trading enablement services.
“Requirements for more complex and faster data processing, with fluctuating demand, are the drivers for combining private hardware-based solutions with virtual solutions used to address the peaks in demand,” O’Brien said. “So, not only will data centers need to remain competitive in their space and power offerings, but they will need to bundle in network solutions to be able to tie resources, physical or virtual, into other locations.”
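The hybrid model O’Brien describes can be pictured as a simple routing rule: baseline load stays on fixed private hardware, and only demand above that capacity bursts to on-demand virtual resources in a connected location. The sketch below is a hypothetical illustration; the capacity figure and function name are assumptions, not anything from the article.

```python
# Assumed fixed capacity of the private (co-located) hardware, in arbitrary
# units of work. Everything up to this level stays on private infrastructure.
PRIVATE_CAPACITY = 100

def route(demand):
    """Split one workload between private hardware and burst virtual capacity."""
    private = min(demand, PRIVATE_CAPACITY)   # baseline stays in-house
    virtual = max(0, demand - PRIVATE_CAPACITY)  # only the peak spills over
    return private, virtual

# Fluctuating demand over a trading day: only the peak bursts to virtual.
for demand in [60, 140, 95]:
    private, virtual = route(demand)
    print(f"demand={demand}: private={private}, virtual={virtual}")
```

The design point is that virtual capacity is paid for only when demand exceeds the fixed footprint, which is why the network links tying the two locations together become part of the data center offering itself.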