Physicists believe that the LHC could play an important role in solving unanswered questions about the laws of physics. However, for this to occur, the practicalities of storing the volume of data that it produces must be addressed.
GlobalData’s technical writer Ellen Daniel spoke to the European Organization for Nuclear Research (CERN) about the importance of networks in enabling cutting-edge discoveries.
Inside the LHC, a ring of superconducting magnets, particles travelling at almost the speed of light collide with each other. Scientists can observe up to 1.7 billion proton-proton collisions per second, producing a data volume of more than 7.5 terabytes per second, which means that CERN’s data centres must be equipped to handle enormous volumes of data.
Tony Cass, Communications Systems group leader in the CERN IT department, explains just how important networking is. He tells GlobalData: “After electricity, networking is the most important thing at CERN. Physicists obviously need it to move the data from the experiments to the Data Centre where it is processed, and then outside to our partners around the world. Then we have 2,500 staff who need it for all our administrative applications and to access the databases, and we have to serve 13,000 users who regularly use it for email, for the web, for everything. The LHC experiments produce about 90 petabytes of data per year, and an additional 25 petabytes per year come from other (non-LHC) experiments at CERN.
“Archiving the vast quantities of data is an essential function at CERN. At the beginning of 2019, 330PB (330 million gigabytes) of data were permanently archived in the CERN Data Centre, the equivalent of over 2,000 years of non-stop HD video recording.”
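That “HD video” comparison can be sanity-checked with a quick back-of-envelope calculation. The sketch below assumes a high-quality HD recording bitrate of roughly 40Mbit/s (comparable to Blu-ray video); the bitrate is our assumption, not a figure from CERN.

```python
# Back-of-envelope check: how long is 330PB of continuous HD video?
# Assumed bitrate (not from CERN): ~40Mbit/s, roughly Blu-ray quality.
archive_bytes = 330e15                 # 330PB archived at CERN, start of 2019
hd_bitrate_bps = 40e6                  # assumed HD bitrate, bits per second
hd_bytes_per_sec = hd_bitrate_bps / 8  # 5 MB of video per second

seconds_of_video = archive_bytes / hd_bytes_per_sec
years_of_video = seconds_of_video / (365.25 * 24 * 3600)
print(f"{years_of_video:.0f} years")   # roughly 2,000 years
```

At that bitrate the archive works out to a little over two millennia of footage, consistent with the comparison Cass gives.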
CERN’s future research not only has the potential to improve human understanding of what makes up our universe; it also means that, going forward, the network will have to meet new challenges.
Vincent Ducret, network engineer in the CERN IT department, tells GlobalData: “CERN needed to increase the capacity of its data centres, including servers, storage, and networking, to fuel the next wave of discovery. The goal wasn’t only more capacity. We needed the network path to be more dynamic. We wanted the ability to use technologies like VXLAN for greater flexibility in connecting servers over the network.
“Juniper was selected as a supplier in this context, with a view to preparing the network for LHC Run 3 and for the High-Luminosity LHC, to be commissioned in 2024. The latter will increase the precision of the ATLAS and CMS detectors. The requirements for data and computing will grow dramatically, with rates expected to be around 500 petabytes per year.”
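To put 500 petabytes per year in networking terms, a rough sketch can convert the annual volume into a sustained transfer rate. This treats the figure as a uniform average, which real experiment duty cycles will not be, so peak loads would be considerably higher.

```python
# Rough average network load implied by 500PB/year, treating the
# annual volume as a uniform stream (real LHC running is bursty,
# so instantaneous peaks will be far higher than this average).
annual_bytes = 500e15                   # projected HL-LHC era data volume
seconds_per_year = 365.25 * 24 * 3600

avg_bytes_per_sec = annual_bytes / seconds_per_year
avg_gbit_per_sec = avg_bytes_per_sec * 8 / 1e9
print(f"{avg_gbit_per_sec:.0f} Gbit/s sustained")   # ~127 Gbit/s
```

Even as a smoothed average, that is well over 100Gbit/s flowing continuously, which illustrates why the network path, and not just raw storage capacity, was a focus of the upgrade.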
For more information
Visit GlobalData’s Verdict