GENEVA--The CERN Computer Center is the number-crunching hub that powers the physics research lab's quest to discover the nature of the universe.
A formidable 8,000 servers housing 40,000 Intel processor cores provide the grunt to help crack the petabytes of data spewed out from the Large Hadron Collider (LHC), based here. Editors' note: This story was originally published on Silicon.com as a photo gallery.
About half of these cores will be used to deal with data from the LHC, which will generate about 15 petabytes of data by colliding protons with protons.
The computer center will provide only about 20 percent of the processing power used to examine the LHC data, with the rest coming from the LHC Computing Grid, a dedicated network of more than 100,000 processors.
Scientists hope the LHC will offer a "glimpse" of the Higgs boson, a particle thought to give matter its mass.
The LHC will produce up to 600 million particle collisions per second. To store the huge amount of data the LHC produces, the center houses 8 petabytes of hard disks and 18 petabytes of magnetic tapes. This will increase to 16 petabytes of disk and 30 petabytes of tape by the end of the year.
Even this is insufficient to store all the raw data the LHC produces, so its four detectors--each looking for different particles and energy signatures--have built-in electronics and smaller computer centers that analyze the petabytes of data per second they collect, throwing away the bulk of the information that is not of interest to the physicists.
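In spirit, this filtering is a cascade of accept-or-reject decisions applied to every collision event, keeping the rare interesting ones. A loose illustration follows; the field names and thresholds here are invented for the sketch and are not CERN's actual trigger logic:

```python
# Toy "trigger": keep the rare interesting events, discard the rest,
# as the detectors' electronics do at vastly higher rates.
# The event fields and thresholds below are invented for illustration.
def trigger(event):
    """Return True if the event looks worth keeping."""
    return event["energy_gev"] > 100 and event["tracks"] >= 2

events = [
    {"energy_gev": 12,  "tracks": 1},
    {"energy_gev": 450, "tracks": 8},   # the kind physicists want
    {"energy_gev": 30,  "tracks": 3},
]
kept = [e for e in events if trigger(e)]
print(len(kept))  # 1 of 3 toy events survives; the real systems discard far more
```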
The data that's left is sent on to the computer center and its racks of servers.
"A lot of processors are devoted to data processing for physics. We are collecting a tremendous amount of data from the collision points," said Jean Michel Jouanigot, head of network services at CERN.
Once the data arrives at the center it is immediately stored and reprocessed before being made available to 7,000 physicists in 33 countries via the LHC grid.
The grid is linked to the center through dedicated 10-gigabit-per-second connections. It can handle about 50,000 users at once, sharing out bandwidth and processing power between scientists.
"The grid is a worldwide collaboration through many hundreds of sites and will get information through very powerful networks," Jouanigot said.
CERN also hosts one of the oldest Internet exchange points in Europe.
Within the computing center itself, the data exchange is handled by 1,500 10-gigabit ports, while information flow within CERN's various sites is handled by 70,000 1-gigabit ports.
Data is stored on tape as soon as it arrives at the computing center. Whenever one of the scientists plugged into the LHC Computing Grid requests data, it is retrieved by a robot within the StorageTek vault.
The center has four robots, each holding about 20,000 tapes, and it's planning to fit in two more.
Using existing tape technology, the room would fill up within 10 years. However, Jouanigot said, the center constantly upgrades to tapes with higher data density; each tape now stores about 750GB, compared with about 200GB two years ago.
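A quick back-of-envelope calculation shows what that density upgrade buys: with the article's figures of four robots holding roughly 20,000 tapes each, the same floor space goes from about 16 petabytes of raw capacity on 200GB tapes to about 60 petabytes on 750GB tapes (treating a petabyte as a round 1,000,000GB for the sketch):

```python
# Back-of-envelope tape-vault capacity, using the figures quoted in
# the article (all values approximate; 1 PB taken as 1,000,000 GB).
TAPES_PER_ROBOT = 20_000
ROBOTS = 4  # two more are planned

def vault_capacity_pb(robots, gb_per_tape):
    """Total raw capacity of the vault in petabytes."""
    return robots * TAPES_PER_ROBOT * gb_per_tape / 1_000_000

print(vault_capacity_pb(ROBOTS, 200))  # ~16.0 PB on the older tapes
print(vault_capacity_pb(ROBOTS, 750))  # ~60.0 PB on today's tapes
```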
Jouanigot said that the center refreshes its hardware about every three to four years. All the hardware in the computing center uses off-the-shelf components, and the servers run a customized version of Red Hat Linux.
The LHC is fed with protons by a series of particle accelerators that increase the speed and energy of the particles. The particles are then fed into the LHC's 17-mile ring and accelerated to 99.9 percent of the speed of light.
Each beam that will collide in the LHC consists of up to 100 billion protons, and the center's 39 consoles allow operators to manage the beams' passage around the accelerators and monitor their cooling. The facility's cryogenic cooling system brings the collider's temperature to just above absolute zero to allow the superconducting magnets that drive the beams to work.
But for the time being, that cooling system has been switched off: the LHC is being returned to room temperature so repairs can be carried out on a fault.
Nick Heath of Silicon.com reported from London.