How CERN and the Large Hadron Collider are leveraging cloud technology


In a 2010 panel discussion podcast, Interarbor Solutions' principal analyst Dana Gardner asked Tony Cass, group leader for Fabric Infrastructure and Operations at CERN; Steve Conway, Vice President in the High Performance Computing Group at IDC; and Randy Clark, Chief Marketing Officer at Platform Computing, for their thoughts on the evolutionary direction of CERN, the European Organization for Nuclear Research based in Geneva, Switzerland.

Last month, it seems, that question was answered.

In early March 2012, CERN announced it would join the Helix Nebula Initiative, nearly doubling its data processing power, reports Tech Republic's Nick Heath. The Helix Nebula Initiative is a pilot program in which cloud computing providers and research organizations seek to kick-start the European cloud computing industry by carrying out scientific research in the cloud.

The initiative will lead and coordinate Europe's scientific communities of interest through a two-year pilot phase, during which procurement processes and governance issues for a public/private partnership framework will be appraised.

“CERN already supplements its processing power via a network of 150 computing centres, known as the Worldwide LHC Computing Grid (WLCG), which stores and analyses its research data,” said Heath. “The WLCG puts some 150,000 processors at CERN’s disposal, but the research institute is examining whether it could double that number by turning to cloud computing.”

Bob Jones is CERN’s head of openlab, the public-private partnership that helps CERN identify new information technologies that could benefit the lab, Heath said. Jones said that CERN’s demand for computing power will soon exceed supply, necessitating a move to the cloud.

CERN is the ideal organization to push the limits of cloud computing, dealing with fantastically large data sets, massive throughput requirements, a global workforce, finite budgets, and an emphasis on standards and openness, Interarbor Solutions' Gardner explained.

Additionally, the computing power and storage promised by the cloud could help researchers analyze LHC data faster, Jones said.

It’s nearly impossible to talk about immense, widely distributed data like the sets generated by CERN and the LHC without mentioning how and with what tools that data will be analyzed.
Linux Journal explains in great detail how CERN used open-source software solutions to analyze data — and how they discovered and implemented Hadoop.

Matt Massie, too, tackles the topic, detailing how the University of Nebraska-Lincoln uses Hadoop to process the results of high-energy physics experiments, including data from CERN and the LHC.
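The Hadoop workflow both articles describe rests on the MapReduce pattern: a map step emits key/value pairs from raw records, a shuffle groups those pairs by key, and a reduce step aggregates each group. The sketch below simulates that pattern locally in plain Python, using made-up detector-event records; it illustrates the idea only, and uses no actual CERN data or Hadoop APIs.

```python
from collections import defaultdict

def map_events(lines):
    """Map step: emit an (event_type, 1) pair for each record."""
    for line in lines:
        event_type, _, _energy = line.partition(",")
        yield event_type.strip(), 1

def shuffle(pairs):
    """Shuffle step: group emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_counts(groups):
    """Reduce step: sum the counts for each event type."""
    return {key: sum(values) for key, values in groups.items()}

# Hypothetical collision records in "event_type,energy_GeV" form.
records = [
    "muon,45.2",
    "electron,12.7",
    "muon,88.1",
    "photon,3.4",
]

counts = reduce_counts(shuffle(map_events(records)))
print(counts)  # {'muon': 2, 'electron': 1, 'photon': 1}
```

In a real Hadoop deployment the map and reduce functions would run as distributed tasks over data stored in HDFS, with the framework handling the shuffle; the payoff is that the same three-step structure scales from a four-line list to the petabyte-scale data sets the LHC produces.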
