Will the Large Hadron Collider & IDT Finally Agree That The Answer is 42?

10-04-2015 | By Paul Whytock

Most of us have heard of the Large Hadron Collider (LHC) at CERN near Geneva, Switzerland, the machine tasked with smashing particles of matter together to simulate the conditions in the moments after the Big Bang, generally regarded as the creation of the Universe.

However, for some of us, Douglas Adams, legendary author of the excellent The Hitchhiker's Guide to the Galaxy, already answered the “Ultimate Question of Life, the Universe, and Everything.”

In Adams’ world, a supercomputer, Deep Thought, churned away for 7½ million years to compute and check the answer and concluded it was 42. However, Deep Thought pointed out that the answer was meaningless because the beings who instructed it never actually knew what the question was.

The Large Hadron Collider is computationally demanding when it comes to analysis

So, back to the real yet somewhat science-fictional world of the LHC. In addition to being the world's largest single machine, the LHC is also one of the most computationally demanding when it comes to analysing its findings. The experiments in the Large Hadron Collider produce about 15 petabytes of data per year. In July 2012, it was revealed that the operators of the LHC had collected about 200 petabytes of data from the 800 trillion collisions created in the search for the Higgs boson.

Just to put that in perspective, a petabyte equals 1,000,000,000,000,000 bytes, so we are talking about biblical amounts of number crunching.
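To make those figures a little more concrete, here is some back-of-the-envelope arithmetic using the decimal petabyte the article cites (10^15 bytes); the averaged rate is purely illustrative, since the real data flow is anything but uniform:

```python
PB = 10**15  # bytes in one petabyte (decimal definition used above)

annual_bytes = 15 * PB                      # ~15 PB recorded per year
seconds_per_year = 365 * 24 * 3600
avg_rate = annual_bytes / seconds_per_year  # average stored-data rate in bytes/s

higgs_bytes = 200 * PB                      # data collected in the Higgs search
print(f"average stored rate: {avg_rate / 1e6:.0f} MB/s")
print(f"Higgs search data:   {higgs_bytes:.3e} bytes")
```

Even smoothed out over a whole year, the 15 PB recorded works out to hundreds of megabytes written every second.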

Low-latency RapidIO interconnect technology for efficient analysis of mass data

To help with this, the US electronics company Integrated Device Technology (IDT) has signed a three-year agreement with CERN to provide low-latency RapidIO interconnect technology. This will be employed to improve the efficiency of mass data analysis from the LHC.

Teams from IDT and CERN will use the IDT technology to improve the quality and timeliness of this data collection and the initial analysis and reconstruction work at the experiments’ data farms and the CERN Data Centre.

The LHC produces millions of collisions every second, generating approximately one petabyte of data per second. The RapidIO technology provides a low-latency connection between clusters of computer processors. Already used in 4G base stations, IDT's low-latency RapidIO products also enable real-time data analytics and data management.

IDT’s current RapidIO 20 Gbps interconnect products will be used in the first stage of the collaboration, with an upgrade path to RapidIO 10xN 40 Gbps technology as research at CERN progresses.

Because of the volume of real-time data CERN collects, current implementations use custom-built ASIC hardware. These ASICs run algorithms implemented in hardware to sample the data, selecting only around 1% of it for further analysis.

The collaboration is based on industry-standard IT form factor solutions suitable for deployment in HPC clusters and data centres. Engineers will use heterogeneous servers based on specifications from RapidIO.org that are targeted toward the Open Compute Project High-Performance Computing initiative that IDT co-chairs.

The computing platform used for the collaboration is based on commercially available RapidIO-enabled 1U heterogeneous servers capable of supporting industry-standard servers, GPU, FPGA and low-power 64-bit SoCs, as well as top-of-rack RapidIO switches available from Prodrive Technologies.


By Paul Whytock

Paul Whytock is Technology Correspondent for Electropages. He has reported extensively on the electronics industry in Europe, the United States and the Far East for over thirty years. Prior to entering journalism, he worked as a design engineer with Ford Motor Company at locations in England, Germany, Holland and Belgium.