‘Compute’ Shadows IoT Data

Terry Flanagan

Financial services has situated computational power near data generation for years now, ever since high-speed electronic traders saw the advantage of having their racks co-located with exchange matching engines.

The rise of the Internet of Things may spur the next migration of ‘compute’. The premise is that this data is generated mostly in and around population centers; firms that seek to mine this data — sourced from parking spaces and refrigerators, streetlights and factory floors — will be inclined to have physical proximity to these areas.

Bill Fenick, Interxion

“This is a normal methodology in trading,” said Bill Fenick, VP of Enterprise at Interxion, a provider of European colocation data-centre services. “Now we’re seeing this writ large with everything else. Trying to gain an edge on data is not going to be an exclusivity for financial services.”

There will be 20.4 billion internet-connected ‘things’ in use in 2020, almost two-and-a-half times the current 8.4 billion, consultancy Gartner has estimated. As for how firms expect to benefit from the technology, analytics and data-science platforms topped the wish list for IoT initiatives, according to a Radiant Advisors / Unisphere Research survey.

As IoT is an emerging phenomenon, the migratory pattern of the computational power that processes IoT data remains to be seen.

“Let the economic forces decide” whether compute moves to the data, said Theodoros Evgeniou, a Fontainebleau, France-based professor of decision sciences and technology management at business school INSEAD. “If there is a need for, or value to be extracted from, lower latency, yes. If not, no.”

Theos Evgeniou, INSEAD

Evgeniou said there may be time-critical IoT applications, potentially in health, security, traffic, and logistics, but they have yet to be proven. “Somebody must be willing to pay to invest, much like the banks invested in high-speed networks,” Evgeniou said. “To invest you must expect you will make a profit later on, from the customers. The customers are the ultimate payers, and they will pay only if they see value.”

The projected explosion in IoT data will be driven by sensors installed on industrial operating equipment and on city infrastructure, said Praas Chaudhuri, co-founder and principal analyst for industrial IoT and smart cities at San Francisco-based ArcInsight Partners.

“The decision to increasingly locate computational power near the edge, closer to operating equipment, allows for rapid execution of mission-critical control and performance decisions, which is key for situations that demand low latency,” Chaudhuri said.

“On the other hand, in many situations such as in smart cities and industrial asset management contexts, decisions required are more strategic and infrequent in nature,” Chaudhuri continued. “It makes more economic sense there to invest in infrastructure that supports aggregation of low-resolution data and manage computational power in the cloud.”

Praas Chaudhuri, ArcInsight Partners

The challenge for companies is determining where to put their algorithm or artificial-intelligence application so as to be best positioned to mine the rich valleys of unstructured data generated by IoT, as well as other content, such as tweets, that also emanates mainly from population centers, said Interxion’s Fenick.

A financial-services pioneer in moving compute close to the data was Tradebot Systems, which in the early 2000s moved its computers from a storefront in a Kansas City suburb to a site near New York City, close to the trade-matching engines of the NYSE and Nasdaq exchanges. The move cut the time to trade a stock from 20/1,000 of a second to 1/1,000 of a second, according to a 2006 Wall Street Journal article.

Tradebot’s CEO was Dave Cummings, who in 2005 founded Bats Global Markets with a similar locational blueprint: the exchange operator’s corporate headquarters was in Lenexa, Kansas, but its matching engine was placed 1,200 miles to the east in Northern New Jersey, just across the Hudson River from the concrete canyons of Wall Street and the bank broker-dealers in midtown Manhattan.
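The latency figures in the Tradebot story follow directly from signal propagation over distance. As a rough back-of-the-envelope sketch (the 200,000 km/s speed of light in optical fiber is an assumed typical value, not a figure from the article):

```python
# Best-case round-trip propagation delay over fiber -- illustrative only.
# Assumption: light travels through optical fiber at roughly 200,000 km/s
# (about two-thirds of its speed in a vacuum).

SPEED_IN_FIBER_KM_S = 200_000
MILES_TO_KM = 1.609344

def round_trip_ms(distance_miles: float) -> float:
    """Round-trip propagation delay over fiber, in milliseconds."""
    distance_km = distance_miles * MILES_TO_KM
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

# Roughly the 1,200 miles between the Kansas City area and New Jersey:
print(f"{round_trip_ms(1200):.1f} ms")  # ~19.3 ms, in line with the ~20 ms cited above

# Co-located, a mile or less from the matching engine:
print(f"{round_trip_ms(1):.3f} ms")     # well under a millisecond
```

Propagation delay is only a floor (switching and processing add more), but it already explains why a Kansas-based firm could not compete on speed without moving its machines east.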

By 2009, banks and trading firms co-locating compute near exchanges’ data and matching engines was prevalent enough that NYSE pre-sold more than 20,000 square feet of space in its new New Jersey data center. For competitive reasons, all the big sell-side banks effectively had to have a presence in exchange data centers, which play the role of modern-day trading floors in terms of information discovery.

“Now we are seeing this same phenomenon happen across industries,” Fenick said. “It’s the laws of physics, the same methodology that’s used in low-latency trading — who can get to the data first, to do something productive with it. Proximity matters.”
