Cloud computing goes open source

New cloud computing test beds may give business IT confidence in on-demand services -- and improve open source

With more and more companies jumping into the cloud computing fray, this week's joint announcement by Hewlett-Packard, Intel, and Yahoo seems like a yawner. But it isn't.

The newest cloud computing initiative will rely heavily on open source technology. And that means participants will have to share their findings and at least some of their code with the open source community. "That's what sets this announcement somewhat apart," says Gartner analyst David Mitchell Smith.

Intel, Yahoo, and HP are forming the Cloud Computing Test Bed, which they describe as a global, multidatacenter, open source effort designed to promote research on software, datacenter management, and hardware for large-scale, Internet-hosted computing.

The companies will host six "centers of excellence," each of which will have a cloud computing infrastructure mostly based on HP hardware and Intel processors. The centers will have 1,000 to 4,000 processor cores and are expected to be up and running later this year for selected researchers from around the globe.

HP's expertise in systems management and Yahoo's work in parallel computing make it likely that the initiative will add significant capabilities to the cloud, says Smith.

Also noteworthy is the participation of Singapore's Infocomm Development Authority. The city-state has a history of interest in innovative IT projects, says analyst Dennis Byron of IT Investment Research.

More than a decade ago, Singapore worked with Data General on a project that was remarkably similar to cloud computing. The company even used a cloud-like logo for the effort, although in those days it went by the name utility computing, Byron recalls.

Should Singapore actually build a cloud for government use, the technology would gain that much more credibility with IT execs who are still leery of putting critical network resources outside their direct control.

Pigs in the cloud

As my colleague Neil McAllister pointed out recently, most existing programming languages were designed in an era when processing resources were scarce. But a major premise of cloud computing is that the massive increases in computing power (and bandwidth) now available make it possible to distribute resources across the Internet.

So it's significant that Yahoo will be contributing Pig, a parallel programming language the company developed that compiles down to MapReduce jobs. Yahoo certainly has relevant experience in massive computing as well; it is providing Carnegie Mellon University with its M45 supercomputing cluster. The cluster has 4,000 processors capable of 27 teraflops, with 3TB of memory and 1.5PB (that's petabytes, or 1,536TB) of storage. It runs the latest version of Hadoop (to which Yahoo is one of the principal contributors) and other open source software, including Pig.
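
To see why a language like Pig matters, consider what even a trivial parallel job looks like when written directly against Hadoop's MapReduce Java API. The sketch below is the classic word-count example; it uses the newer org.apache.hadoop.mapreduce classes rather than the API current at the time of this announcement, and the class names and file paths are illustrative, not code from the test bed itself.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal Hadoop word count: the framework splits the input across the
// cluster, runs the mapper on each split in parallel, then groups the
// emitted values by key and hands each group to a reducer.
public class WordCount {

  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // Emit (word, 1) for every token in this line of input.
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      // Sum the counts emitted for this word by all the mappers.
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output paths are placeholders supplied on the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The same computation in Pig Latin runs to a handful of lines of dataflow script; Pig compiles such scripts into chains of MapReduce jobs like the one above, which is exactly the kind of productivity gain the test bed is meant to explore.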

Yahoo has also teamed with Computational Research Laboratories (CRL), a lab run by India's Tata Group, to offer supercomputing facilities free to academic institutions in India that are researching large-scale computing, particularly around Apache Hadoop. Hadoop will also be used in the HP-Intel-Yahoo cloud computing initiative.
