Big data vital for Large Hadron Collider project: CTO

The European Organization for Nuclear Research (CERN) Openlab’s Sverre Jarp says the collider generated 30 terabytes of data in 2012

The Large Hadron Collider

When you’re trying to learn more about the universe with the Large Hadron Collider (LHC), which generated 30 terabytes of data this year, big data technology is vital for analysing that information, says CERN Openlab CTO Sverre Jarp.

Speaking at the Big Data Warehousing and Business Intelligence 2012 conference in Sydney this week, the European Organization for Nuclear Research (CERN) Openlab’s Jarp told delegates that physics researchers need to measure electrons and other elementary particles inside the LHC near Geneva, Switzerland.

“These particles fly at practically the speed of light in the LHC so you need several metres in order to study them,” he said. “When these collide, they give tremendous energy to the secondary particles that come out.”
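(As a back-of-the-envelope check, not from the talk itself: the 2012 run collided protons at about 4 TeV per beam, and with a proton rest energy of roughly 0.938 GeV the relativistic arithmetic bears Jarp out.)

\[
\gamma = \frac{E}{m_p c^2} \approx \frac{4000\ \mathrm{GeV}}{0.938\ \mathrm{GeV}} \approx 4300,
\qquad
\frac{v}{c} = \sqrt{1 - \frac{1}{\gamma^2}} \approx 1 - 2.7 \times 10^{-8}
\]

A proton at that energy travels within a few parts in a hundred million of the speed of light, “practically the speed of light” indeed.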


Tags: business intelligence, Large Hadron Collider, CERN
