Server virtualization, cloud software come to CERN

The Large Hadron Collider team looks to improve CPU utilization and the delivery of computing resources to scientists

CERN, the European particle physics organization that runs the Large Hadron Collider, is embracing server virtualization and cloud computing technology to improve CPU utilization and the delivery of computing resources to scientists around the world.

CERN, which uses Red Hat's version of the Xen hypervisor as well as Microsoft's Hyper-V, has just installed private cloud software from Platform Computing to automate the process of managing virtual infrastructure. Virtualization and advanced management tools will help deliver compute cycles more efficiently to 10,000 researchers from 85 countries.


"It will greatly facilitate our ability to deliver resources to the users," says Tony Cass, group leader for fabric infrastructure and operations at CERN. "Users always want more CPU cycles so they can evaluate one more scenario or try out one more thing. The more cycles we can squeeze out of the fixed amount of resources we have, the more physics they can do."

Platform has billed its Platform ISF software as a "private cloud" tool that aggregates servers, storage, networking tools and hypervisors to create a shared pool of physical and virtual resources. An announcement from Platform Computing credits the software with helping CERN build "the world's largest cloud computing environment for scientific collaboration."

Cass says he doesn't actually view the project as a cloud initiative, but says it improves the institute's grid network by making better use of virtual resources. For example, Platform ISF makes sure the proper amount of resources is dedicated to different applications, such as workloads that require different versions of an operating system. The software also determines which virtual machines are placed on particular pieces of hardware, makes sure that enough network bandwidth is allocated to each VM, and ensures that VMs are taken offline once they are no longer needed.
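
Platform doesn't publish ISF's placement algorithm, but the kind of decision Cass describes, fitting VMs onto hosts that still have spare CPU and network bandwidth, can be sketched as a simple first-fit heuristic. Everything below, from the host names to the policy itself, is an illustrative assumption rather than the product's actual behaviour:

```python
# Hypothetical sketch of the placement decision described above; not
# Platform ISF's actual algorithm or API.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu_free: int                     # free CPU cores
    net_free: float                   # free network bandwidth, Gbit/s
    vms: list = field(default_factory=list)

@dataclass
class VM:
    name: str
    cpu: int
    net: float

def place(vms, hosts):
    """First-fit-decreasing: biggest VMs first, onto the first host that
    still has enough CPU and network bandwidth for them."""
    for vm in sorted(vms, key=lambda v: (v.cpu, v.net), reverse=True):
        for host in hosts:
            if host.cpu_free >= vm.cpu and host.net_free >= vm.net:
                host.cpu_free -= vm.cpu
                host.net_free -= vm.net
                host.vms.append(vm.name)
                break
        else:
            raise RuntimeError(f"no host can fit {vm.name}")

hosts = [Host("node01", cpu_free=8, net_free=1.0),
         Host("node02", cpu_free=8, net_free=1.0)]
place([VM("batch-a", 4, 0.5), VM("batch-b", 4, 0.5), VM("batch-c", 6, 0.3)],
      hosts)
for h in hosts:
    print(h.name, h.vms)
```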

Platform ISF also automates live migration, allowing VMs to move from one physical server to another without being shut down, Cass says.
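
The article doesn't describe how Platform ISF drives the hypervisor, but on a libvirt-managed Xen host, the kind CERN runs, the underlying live-migration operation looks roughly like this sketch; the connection URIs and VM name are placeholders:

```python
# Minimal live-migration sketch using the libvirt Python bindings. This
# shows the hypervisor-level operation, not Platform ISF's own interface.
import libvirt

src = libvirt.open("xen:///system")              # source Xen host
dst = libvirt.open("xen+ssh://node02/system")    # destination host

dom = src.lookupByName("batch-worker-17")        # the running VM
# VIR_MIGRATE_LIVE keeps the guest running while its memory is copied over.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

print("migrated:", dst.lookupByName("batch-worker-17").isActive())
```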

So far, CERN is running a few hundred VMs on the Intel-based x86 servers that make up its batch environment, which serves the scientific community. CERN could potentially have 60,000 or more VMs running batch jobs in the future, however. Cass wants to aggressively move batch jobs to VMs over the next year, but the rate of adoption depends partly upon user acceptance.

With the Large Hadron Collider having just recently come online, "people will be wary of too many changes in the computing facilities we're providing, but if we can demonstrate that virtualization is perfectly safe there's no reason we wouldn't migrate most of the batch [jobs to VMs] by the end of next year," he says.

Users won't have a self-service interface as extensive as Amazon's Elastic Compute Cloud, but they will be able to choose among a few preconfigured software stacks, each including an operating system, compiler and other software. CERN, which also uses Platform LSF, the company's earlier grid workload management software, hopes its latest undertaking will improve system utilization by 15 to 20 per cent.

CERN has to process and distribute more than 15 petabytes of data to researchers per year, all in near real time, and has 60,000 CPU cores to manage the load. Ultimately, Cass says CERN may use Platform ISF, the new product, to manage "every single machine we've got."
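
Spread evenly over a year, those figures work out to roughly half a gigabyte per second of sustained throughput; the back-of-envelope arithmetic, assuming an even spread, is simple:

```python
# Back-of-envelope: what "15 petabytes per year in near real time" implies.
PB = 10**15
SECONDS_PER_YEAR = 365 * 24 * 3600

rate = 15 * PB / SECONDS_PER_YEAR
print(f"sustained throughput: {rate / 1e6:.0f} MB/s")        # ~476 MB/s
print(f"per core (60,000 cores): {rate / 60_000 / 1e3:.1f} kB/s")
```

That half-gigabyte-per-second figure is only an average; the near-real-time requirement means peak rates will run higher.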
