HPE’s ‘The Machine’: Is this the future of computing?

Meet Hewlett Packard Enterprise’s vision of the future of computing. It’s called simply ‘The Machine’.


Meet Hewlett Packard Enterprise’s vision of the future of computing. It’s called simply ‘The Machine’. It’s a radical departure from today’s computer architecture, being developed by HPE’s R&D arm, Hewlett Packard Labs, to deal with the conflicting trajectories of computing hardware and software.

Jaap Suermondt, vice president of software and analytics at HPE, who is responsible for The Machine’s software stack and applications, told Computerworld Australia that it was designed to cater for the growing requirement to perform analytics on ever-larger datasets at a time when the annual doubling of price/performance associated with Moore’s Law is coming to an end.

“By every indication we are seeing Moore’s Law starting to slow down,” he said. “On the one hand you have data that keeps growing exponentially, and on the other you have computing power that is not keeping up.”

Suermondt said that efforts to meet the ever-growing demands for processing power had centred on scaling out traditional computing architectures.

“What we have done for five or 10 years now has essentially been a work-around with various scale-out and clustering architectures: putting more and more computers together and connecting them by copper wires,” he said. “This has led to an enormous increase in the energy consumption of data centres; that is becoming really problematic.”

Central memory and peripheral processors

The Machine abandons the traditional computer architecture of a central processor with peripheral storage (solid-state memory, spinning disc and flash), replacing it with a vast fabric of non-volatile semiconductor memory that simultaneously fulfills the functions of long-term storage and traditional computer memory, and makes the data it holds available to multiple processors.

“We think non-volatile memory is becoming an inevitability,” Suermondt said. “So we are going to put all the memory together on a big fabric and have the processors attached to that as peripherals. In this new architecture there is no longer any central processor; it's a peripheral processor.”

The Machine will have a fabric of non-volatile memory the size of which will dwarf that of current systems, Suermondt said. “The largest machines you can buy today are between 24 and 48 terabytes. That sounds like a stunning amount of memory but it is still pretty limited. Even our early prototype of The Machine will have about 100 terabytes of memory.”


He added: “We want to provide hundreds of terabytes or even petabytes of memory all on the same fabric and addressable by any processor that hangs off that fabric. That is the essence of The Machine. In this new architecture any processor will be able to address any byte in The Machine and compute on it.”
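To make that idea a little more concrete, the sketch below approximates the programming model on an ordinary Linux machine by memory-mapping a file that stands in for the fabric, with offsets into the mapping playing the role of fabric addresses. It is an illustration only; the file path and sizes are invented, and it says nothing about how The Machine itself exposes its memory.

```cpp
// Conceptual sketch only: approximating "any processor can address any byte"
// on an ordinary Linux box by memory-mapping a file that stands in for the
// fabric. The path and sizes are invented for illustration and say nothing
// about how The Machine actually exposes its memory.
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main() {
    const char*  path        = "/tmp/fabric_pool.bin";  // hypothetical backing file
    const size_t fabric_size = 1ull << 30;              // 1 GiB stand-in for a multi-terabyte fabric

    int fd = open(path, O_RDWR | O_CREAT, 0600);
    if (fd < 0) { std::perror("open"); return 1; }
    if (ftruncate(fd, fabric_size) != 0) { std::perror("ftruncate"); return 1; }

    // Map the whole pool; every byte becomes directly load/store addressable.
    void* base = mmap(nullptr, fabric_size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (base == MAP_FAILED) { std::perror("mmap"); return 1; }

    // A "fabric address" is then just an offset into the pool. Any process
    // mapping the same file sees the same bytes, loosely mirroring a shared fabric.
    std::uint64_t* counter =
        reinterpret_cast<std::uint64_t*>(static_cast<char*>(base) + 4096);
    *counter += 1;

    msync(base, fabric_size, MS_SYNC);  // flush so the update outlives the process, as NVM would
    munmap(base, fabric_size);
    close(fd);
    return 0;
}
```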

The use of memory that could simultaneously provide long-term storage and make vast amounts of data accessible for computing would, Suermondt said, open up many new applications and transform existing ones. IT security, he suggested, would be one of the first areas to benefit.

Instant access to petabytes of data

“Today data can be examined in transit to detect malware and other anomalies that indicate security issues, but then the data passes into a vast repository that is only examined as part of a post-breach forensic investigation. With The Machine you will be able to hold months of data and continually analyse it in real time. That dramatically changes your capabilities and what you can detect.”

The architecture of The Machine would also, he said, enable disparate data sets to be compared, with suitable measures to ensure the confidentiality of sensitive data.

“There’s a big opportunity we have been talking about for a long time — precision medicine. We believe this technology is extremely promising in that area. You will be able to get the lifetime data from one patient and, in a privacy preserving way, compare them to hundreds of others and immediately find patients just like them.”

(Earlier this month, it was reported that IBM’s Watson system had correctly diagnosed a rare form of leukaemia that had baffled doctors: It was fed the patient’s genetic data, which it then compared with data from 20 million oncological studies.)

HPE has promised to unveil a prototype of The Machine before the end of 2016, but has given no indication of when a commercial product might be available. “It is not going to be pretty and is not going to be something we can ship, but it will be awesome: it will be the real thing with real software and show some pretty dramatic capabilities,” Suermondt said.


In parallel with developing the hardware of The Machine, HPE has been working on software for the new architecture for some time. Suermondt said it would run a version of Linux able to handle the very large memory fabric. He said it would present little change for software developers: “Today a developer would declare a data structure and allocate memory. In this new version they can make that memory persistent or non-persistent; that’s the only change.”
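The sketch below illustrates that claim with a hypothetical allocator: the same data structure is allocated either as ordinary volatile memory or as notionally persistent memory, and that choice is the only difference the developer sees. The `allocate` helper and the `Lifetime` flag are invented for illustration and are not an HPE or Linux API.

```cpp
// Illustrative only: a hypothetical allocator showing the single choice a
// developer would face under this model -- whether a structure lives in
// ordinary volatile memory or in the persistent fabric. The allocate() call
// and Lifetime flag are invented for this sketch, not an HPE API.
#include <cstddef>
#include <cstdlib>

enum class Lifetime { Volatile, Persistent };

void* allocate(std::size_t bytes, Lifetime lifetime) {
    if (lifetime == Lifetime::Volatile) {
        return std::malloc(bytes);   // ordinary DRAM-style allocation
    }
    // Stand-in for carving space out of the non-volatile fabric. In a real
    // system this would return fabric-backed memory that survives power loss.
    return std::malloc(bytes);       // placeholder so the sketch compiles and runs
}

struct Record {
    long   id;
    double value;
};

int main() {
    // Same data structure, same code path; only the lifetime flag differs.
    auto* scratch = static_cast<Record*>(allocate(sizeof(Record), Lifetime::Volatile));
    auto* durable = static_cast<Record*>(allocate(sizeof(Record), Lifetime::Persistent));

    scratch->id = 1;
    durable->id = 2;   // in the envisioned architecture this write would persist

    std::free(scratch);
    std::free(durable);
    return 0;
}
```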

Software will be open source

He said HPE had also developed a transactional database for use with The Machine, and software for presenting the results of data analysis graphically. “We’ve done that as an example. We are not planning to get into the transactional database business. There are companies doing that already, and we think it will be very straightforward to port something like Hana to this new architecture.”

He added: “We have developed a set of graphing tools that takes advantage of the very large memory architecture, because we think the golden age of graphs is just about to start. We will open source that, and pretty much the whole machine software stack. We firmly believe in doing all that as the best way to seed the ecosystem and get rapid adoption.”
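As a rough illustration of why large, uniformly addressable memory suits graph analytics, the sketch below holds an adjacency list entirely in memory and runs a breadth-first traversal; on a clustered system the same traversal typically crosses the network on every hop once the graph no longer fits on one node. The code is plain standard C++ with a tiny hard-coded graph and has no connection to HPE’s actual graph tools.

```cpp
// A minimal sketch of why a huge single address space suits graph work: the
// whole adjacency structure can sit in memory and be traversed by pointer
// chasing, with no partitioning across cluster nodes. Plain standard C++,
// nothing HPE-specific; the graph here is tiny and hard-coded for illustration.
#include <cstdio>
#include <queue>
#include <vector>

int main() {
    // Adjacency list: vertex -> neighbours. On a machine like the one described,
    // this structure could span terabytes and still be reachable from any processor.
    std::vector<std::vector<int>> adj = {
        {1, 2},   // 0 -> 1, 2
        {3},      // 1 -> 3
        {3},      // 2 -> 3
        {}        // 3
    };

    // Breadth-first search from vertex 0: the kind of traversal that suffers
    // when a graph is sharded over a cluster and each hop becomes a network round trip.
    std::vector<bool> seen(adj.size(), false);
    std::queue<int> frontier;
    frontier.push(0);
    seen[0] = true;

    while (!frontier.empty()) {
        int v = frontier.front();
        frontier.pop();
        std::printf("visited %d\n", v);
        for (int next : adj[v]) {
            if (!seen[next]) {
                seen[next] = true;
                frontier.push(next);
            }
        }
    }
    return 0;
}
```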

To help software developers write software for The Machine ahead of its availability, HPE is using its Superdome X, a high-end x86 machine that can have up to 24TB of memory, along with software to emulate The Machine.
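One generic way to emulate such a memory fabric on a single large-memory server is to give several ordinary processes a common shared-memory mapping and treat each process as a stand-in for a node, as in the sketch below. This is a hedged illustration using standard POSIX calls, not a description of HPE’s emulation software; the shared-memory name and size are invented.

```cpp
// Hedged sketch: emulating fabric-attached memory on one large-memory x86
// server by giving several ordinary processes the same shared mapping and
// treating each process as a stand-in for a node on the fabric. Plain POSIX
// calls; not HPE's emulation toolchain. Name and size are invented.
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <cstddef>
#include <cstdio>

int main() {
    const std::size_t pool_size = 64ull << 20;  // 64 MiB stand-in for the emulated fabric

    int fd = shm_open("/fabric_emulation", O_CREAT | O_RDWR, 0600);
    if (fd < 0) { std::perror("shm_open"); return 1; }
    if (ftruncate(fd, pool_size) != 0) { std::perror("ftruncate"); return 1; }

    char* pool = static_cast<char*>(
        mmap(nullptr, pool_size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
    if (pool == MAP_FAILED) { std::perror("mmap"); return 1; }

    // Every process that opens "/fabric_emulation" and maps it sees the same
    // bytes, loosely imitating processors hanging off a common memory fabric.
    pool[0] = 42;

    munmap(pool, pool_size);
    close(fd);
    return 0;
}
```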

As for The Machine’s core technology, non-volatile memory, Suermondt said there were a number of candidate technologies approaching commercialisation.

HPE is putting its efforts into memristors, in partnership with SanDisk. “Intel has a technology called 3D XPoint,” Suermondt said. “There is something called phase change memory [being developed by IBM], which I believe is getting very close. There is yet another technology, spin transfer [spin transfer magnetoresistive random access memory] that other companies are investing in.”

Key to the commercial viability of The Machine will be the cost of this memory. Suermondt said he was not in a position to make any official comment, but he believed that likely demand for large volumes would result in affordable prices.
