Counting bosons to improve data analysis at the LHC

Method developed at DESY improves luminosity measurement

Two scientists from DESY have set out to make LHC collision data much more precise. In the course of one PhD thesis, that of David Walter, the PhD student and his colleague Andreas Meyer have worked out a method to complement the existing measurements of the collision rate in the Large Hadron Collider using a well-known and abundantly produced particle, the Z boson.

The data analysis rule of thumb is: the more collisions, the better. To search for new physics, the Large Hadron Collider (LHC) experiments at CERN aim to collect and investigate as many proton-proton collisions as possible, increasing the chances that undiscovered particles are created and found. However, unexpected results might already be present in current data and show up as small deviations from known processes. To study this possibility, physicists measure the probability that a known process occurs, called the cross section, with the best possible precision. Because some processes are known to occur very rarely (i.e. to have a small cross section), more collisions are needed to measure those precisely as well. The cross section is determined by counting how often a process occurs within a data sample of a given size; this sample size is called the integrated luminosity. Accurate cross-section measurements therefore require not only a high number of collisions but also precise knowledge of the integrated luminosity.
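The relation described above can be written as a one-line calculation: the cross section is the number of observed events divided by the integrated luminosity. A minimal sketch with illustrative numbers (not actual LHC values):

```python
# Cross section = observed event count / integrated luminosity.
# All numbers below are illustrative placeholders, not measured values.

n_events = 50_000        # observed events of a known process
int_luminosity = 25.0    # integrated luminosity of the data sample, in fb^-1

cross_section = n_events / int_luminosity  # in fb
print(f"cross section: {cross_section:.1f} fb")  # prints "cross section: 2000.0 fb"
```

The same arithmetic shows why both inputs matter: any relative uncertainty on the integrated luminosity propagates directly into the measured cross section.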

Traditionally, scientists measure the luminosity with the help of van der Meer scans, a technique that pinpoints the properties of the colliding beams very precisely and calculates the number of collisions per unit of time and area accordingly.

At the LHC, bunches of more than a hundred billion protons each cross every 25 nanoseconds, i.e. 40 million times per second. Typically, each bunch crossing produces not just a single proton-proton collision but multiple ones simultaneously, a phenomenon known as pileup. This increases the number of collisions recorded per second, the so-called instantaneous luminosity. In 2017, the CMS experiment took data with an average pileup of around 30 simultaneous collisions, corresponding to an average of 1.2 billion collisions per second. In future LHC runs, the pileup is expected to increase by a factor of 6 or 7. This confronts physicists with a problem: with higher pileup, it becomes more difficult to measure the instantaneous luminosity precisely, i.e. to know how many collisions happened.
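The collision rate quoted above follows from the numbers in the text: a crossing every 25 nanoseconds and an average of 30 collisions per crossing. A quick check:

```python
# Collision rate = bunch-crossing rate x average pileup.
# Numbers taken from the text: a crossing every 25 ns, average pileup of 30.

crossing_period_ns = 25
crossing_rate_hz = 1e9 / crossing_period_ns  # 40 million crossings per second
avg_pileup = 30                              # simultaneous collisions per crossing

collision_rate_hz = crossing_rate_hz * avg_pileup
print(f"{collision_rate_hz:.1e} collisions per second")  # prints "1.2e+09 collisions per second"
```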

Seeking a solution, DESY scientists David Walter and Andreas Meyer have developed a novel approach that determines the luminosity using a particle well understood from other precision measurements. The Z boson is produced in large quantities and decays immediately. In about 3% of cases, often enough for this method, it decays to two muons, the heavier siblings of electrons and the particles that the CMS detector specialises in. This means that the number of Z bosons can be measured extremely well. "In standard running nowadays, the LHC produces about 500 Z boson events decaying into muons per minute," Meyer explains. This wealth of data makes it possible to use Z bosons to boost the precision of the luminosity determination and thus, in turn, of the data analysis.
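The idea runs the cross-section relation in reverse: if the effective Z production rate per unit of luminosity is known, counting Z boson decays to muons yields the integrated luminosity. A minimal sketch of this inversion, with purely illustrative numbers for the count and the effective cross section (neither is a measured value):

```python
# Inverting the cross-section relation: L_int = N_Z / sigma_eff, where
# sigma_eff bundles the Z production cross section, the Z -> mu mu branching
# fraction and the detection efficiency. Illustrative placeholders only.

n_z_events = 600_000      # counted Z -> mu mu candidates (illustrative)
sigma_eff_fb = 600_000.0  # effective cross section in fb (illustrative)

int_luminosity_fb_inv = n_z_events / sigma_eff_fb
print(f"inferred integrated luminosity: {int_luminosity_fb_inv:.2f} fb^-1")
```

Because the Z count grows with the amount of data, the statistical precision of this luminosity estimate improves as more collisions are recorded.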

The method developed by the DESY scientists has been compared to conventional measurements and shows good agreement, indicating that better precision can be achieved using Z bosons. "We have also seen that the method agrees with conventional luminosity measurements even for the highest values of instantaneous luminosity recorded by CMS," Walter explains. "The approach will therefore be suitable also in the future, when the experiments will experience unprecedented pileup levels."

Original publication: "Luminosity determination using Z boson production at the CMS experiment"