Hundreds of millions of particle collisions take place every second at the heart of the LHC's detectors. The sensors generate about one petabyte of data every second, an amount no computing system in the world could store if it were generated for any prolonged period.
Most of the data is discarded almost immediately, as sophisticated trigger systems select what could be of interest to scientists and filter out the clutter. Tens of thousands of processor cores then go further still, choosing just one percent of the remaining events - information which then gets stored and is later analysed by physicists.
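The scale of that two-stage cull can be illustrated with a toy simulation. The sketch below is purely illustrative: the acceptance rates are hypothetical round numbers, and real trigger decisions are based on detector signatures rather than randomness.

```python
import random

random.seed(42)  # deterministic for illustration

N_EVENTS = 1_000_000  # a tiny stand-in for the hundreds of millions per second

# Hypothetical acceptance rates; actual LHC triggers select on physics
# signatures (energy deposits, track patterns), not random sampling.
HARDWARE_KEEP = 0.001  # first-stage hardware keeps a small fraction
SOFTWARE_KEEP = 0.01   # the processor farm keeps ~1% of the survivors

after_hardware = sum(1 for _ in range(N_EVENTS)
                     if random.random() < HARDWARE_KEEP)
after_software = sum(1 for _ in range(after_hardware)
                     if random.random() < SOFTWARE_KEEP)

print(N_EVENTS, after_hardware, after_software)
```

Even with these made-up rates, only a handful of events out of a million survive both stages, which is why a petabyte per second shrinks to something a data centre can actually write to disk.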
The data centre can save 6GB of data per second at the LHC's peak rate. However, this gigantic machine doesn't run 24/7. "We're expecting about 30 petabytes per year of LHC Run 2 - that would represent something like 250 years of high-definition video," Frédéric Hemmer, head of the IT department, told ZDNet.
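Those figures can be sanity-checked with a little arithmetic. The sketch below, assuming a 365-day year and decimal petabytes, converts 30 PB spread over 250 years of playback into an average video bitrate, and compares a full year at the 6GB/s peak rate with the 30 PB actually expected:

```python
PB = 10**15                       # decimal petabyte, in bytes
SECONDS_PER_YEAR = 365 * 24 * 3600

stored_per_year = 30 * PB         # expected output per year of Run 2
video_duration = 250 * SECONDS_PER_YEAR

# Implied average bitrate if 30 PB corresponds to 250 years of video
bitrate_mbps = stored_per_year * 8 / video_duration / 1e6
print(f"{bitrate_mbps:.1f} Mbit/s")  # roughly 30 Mbit/s, a plausible HD bitrate

# Writing 6 GB/s around the clock would amount to far more than 30 PB...
peak_year_pb = 6e9 * SECONDS_PER_YEAR / PB
print(f"{peak_year_pb:.0f} PB")

# ...so storage runs at a small effective duty fraction over the year
print(f"{stored_per_year / (6e9 * SECONDS_PER_YEAR):.0%}")
```

The implied bitrate of about 30 Mbit/s is consistent with high-definition video, and the gap between roughly 189 PB of theoretical peak-rate capacity and 30 PB actually stored reflects the fact that the machine doesn't run continuously.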