HP's radical new Machine could start computing by 2016

Martin Fink, the head of HP Labs, offered a road map for the company's new Machine computer architecture

HP Labs director Martin Fink, talking about the Machine at the Software Freedom Law Center conference, Columbia University

Hewlett-Packard's efforts to usher in an entirely new computer architecture, one potentially much faster and simpler, may bear fruit by the end of 2016, when the company's lab expects to have the first prototype machine based on its design.

Should HP's Machine architecture prove successful, everyone in the IT business, from computer scientists to system administrators, may have to rethink their jobs, judging by a talk HP Labs Director Martin Fink gave at the Software Freedom Law Center's 10th anniversary conference Friday in New York.

HP first unveiled its plans for the Machine at its Discover user conference in June, anticipating that it could offer commercially available computers based on the design within 10 years. Fink said that the first prototype machine could be operating by 2016, giving the company a few extra years to work out bugs.

The Machine rethinks the Von Neumann computer architecture that has been dominant since the birth of computing, in which a computer has a processor, working memory and storage. To run a program, the processor loads instructions and data from storage into working memory, performs the operations, then copies the results back to disk for permanent storage if necessary.
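
In rough terms, that load-compute-store pattern looks like the sketch below (in Python; the file names and the transformation are invented purely for illustration):

    # Classic Von Neumann-style workflow: data lives on disk, is copied into
    # working memory to be processed, and the results are copied back to disk.
    def process(records):
        # The computation happens on the in-memory copy of the data.
        return [r.upper() for r in records]

    with open("input.dat") as f:             # 1. load data from storage into working memory
        records = f.read().splitlines()

    results = process(records)               # 2. operate on the in-memory copy

    with open("output.dat", "w") as f:       # 3. copy the results back to storage to persist them
        f.write("\n".join(results))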

Because the fabrication techniques for producing today's working memory -- RAM -- are reaching their limits, the industry will have to move to another form of memory. A number of experimental designs for next-generation memory are in development, and HP is working on its own, called the memristor, on which the Machine is based.

All of the new memory designs share the characteristic of being persistent: unlike today's RAM, they retain their contents when they lose power. In effect, they can take the place of traditional storage mechanisms such as hard drives or solid-state disks. This means computers can operate directly on the data itself -- on the memristors, in HP's case -- instead of shuttling it between working memory and storage.
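
The Machine's programming model has not been published, but memory-mapped files on today's systems offer a loose analogy: the program updates data in place and the change persists without a separate write-back step. The sketch below (in Python; the file name and 8-byte layout are invented) only illustrates that idea:

    import mmap

    # Update an 8-byte counter in place; the change persists through the
    # mapping rather than through an explicit write-back call.
    with open("counter.dat", "r+b") as f:
        with mmap.mmap(f.fileno(), 8) as mem:
            value = int.from_bytes(mem[:8], "little")
            mem[:8] = (value + 1).to_bytes(8, "little")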

Such a seemingly simple change in the architecture will nonetheless have a series of radical "cascading" effects on how computation is done, Fink explained.

For starters, the approach means computers, in theory, could be much more powerful than they are today.

HP has stated that a Machine-like computer could offer six times the performance while requiring 80 times less power. Fink expects that the first prototype HP is working on will have 150 compute nodes with 157 petabytes of addressable memory.

Such a machine would require an entirely new operating system, Fink said. Most of the work an OS does involves copying data back and forth between memory and disk. The Machine eliminates the notion of data reads or writes.

HP Labs is working on what Fink called a "clean sheet" OS, called Carbon. The company will publish open-source code as early as next year. It is also modifying a version of Linux, tentatively called "Linux++," to help users transition from the Von Neumann style of computing.

Applications may need to be rethought as well under this new style of computing. In particular, relational databases will seem archaic due to their elaborate mechanisms of indexing and flushing data to disk once a transaction is completed. In the new design, "the notion of secondary persistence goes away," Fink said.

Instead, database operations will be closer to what are known as graph databases, in which a program looks for ways of working across all the data available to the particular problem at hand. Fink noted that Facebook has built some expertise in this approach, given that it possesses and routinely processes a dataset of interrelated information about its billion users.
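
As a loose illustration of the difference, a graph-style lookup follows relationships held directly in memory instead of reassembling them through indexes, joins and disk flushes; the toy sketch below uses invented data:

    # Invented data: who is connected to whom.
    friends = {
        "alice": ["bob", "carol"],
        "bob":   ["dave"],
        "carol": ["dave"],
        "dave":  [],
    }

    def friends_of_friends(person):
        # Follow the edges directly in memory; no index lookup or disk flush is involved.
        return {fof for f in friends[person] for fof in friends[f]}

    print(friends_of_friends("alice"))   # {'dave'}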

Instead of relational databases, Hadoop may be a more natural fit for this new architecture, given that it doesn't need to impose any order on the data itself, Fink said.

Another advantage the Machine architecture will offer is simplicity, Fink said. Today, the average system may have between nine and 11 layers of data storage, from super-fast L1 caches to slow disk drives. Each layer is a tradeoff between persistence and speed. This hierarchy leads to a lot of design complexity, almost all of which can be eliminated by the flatter design of a single, persistent, fast memory.
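
The sketch below illustrates the kind of tiered bookkeeping a flat, persistent memory would remove; the in-memory cache and the slow-storage stand-in are hypothetical:

    # Software today often manages explicit tiers: check a fast cache first,
    # fall back to slower storage on a miss, then copy the result back up.
    ram_cache = {}

    def read_from_slow_storage(key):
        # Stand-in for a much slower access further down the hierarchy.
        return "value-for-" + key

    def lookup(key):
        if key in ram_cache:                    # hit in the fast tier
            return ram_cache[key]
        value = read_from_slow_storage(key)     # miss: fall back to a slower tier
        ram_cache[key] = value                  # copy it up the hierarchy for next time
        return value

    print(lookup("some-key"))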

"Our goal with the machine is to eliminate the hierarchy," Fink said.

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com
