Helix Nebula - the Science Cloud to address CERN LHC huge data issues

CERN, the European laboratory for particle physics, enables 13 000 physicists around the world to understand the universe, how it works and what it is made of, using one of the largest and most complex pieces of scientific equipment in the world.
CERN's flagship experiment is the Large Hadron Collider (LHC), a 27 km underground ring straddling the border between France and Switzerland. It fires two beams of particles around the ring at just below the speed of light and collides them at four locations. At these points, massive detectors the size of Notre Dame gather the data, produced at a rate of 1 petabyte per second, which then needs to be processed, analysed and stored.

This creates a huge data issue.

CERN has been investigating the use of public cloud services over the past few years to expand its resources in a transparent fashion. In 2016 CERN ran an open public tender, won by T-Systems' Open Telekom Cloud (OTC), of which Huawei is a part, to deliver compute capacity to the LHC.

In this interview with Huawei, Tim Bell, Compute and Monitoring Group Leader at CERN, explains that they are now looking at two areas for future collaboration. The first uses the extreme computing challenges of the LHC in an industry-research partnership called CERN OpenLab. The other is based around Helix Nebula - the Science Cloud, where, together with a number of other science labs, they are exploring what they would need from a public cloud for science.

The video interview is available on the HNSciCloud YouTube channel: https://youtu.be/kuFF52QwzOg

Interview courtesy of Huawei

Who's behind HNSciCloud?

The HNSciCloud Pre-Commercial Procurement is driven by ten leading research organisations from across Europe.