Friday, May 31, 2013

Pileup

In particle physics, large quantities of data are a mixed blessing. Most obviously, petabytes of data mean that you can get a statistically significant result, but they also mean that analysis is complicated and time-consuming. Beyond that, with accelerators like the Large Hadron Collider at CERN, not only do experiments collect tons of data, but the data can arrive from many events at once, which means that even with the most advanced trackers, particle tracks overlap. In the LHC, for instance, bunches of protons circulate in the accelerator at nearly the speed of light until opposing bunches are directed into a collision at the center of one of the detectors. At relatively low beam intensities, most of the protons in a bunch don't actually collide, and you get just one or two proton-proton collisions per bunch crossing. When the beam intensity is ramped up, however, each crossing produces dozens of nearly simultaneous proton-proton collisions and interactions, all of which send showers of particle debris into the surrounding detector. Pileup occurs when two or more particles hit the same part of the detector at very nearly the same time, and it somewhat complicates the event analysis.

Luckily, the experiments at the LHC have highly segmented detectors that are capable of tracking individual particles over fairly long distances, and it turns out that through a process called event reconstruction, tracks can be extrapolated back to the location of the collision. Since the proton-proton collisions don't all happen at exactly the same point along the beamline, such an extrapolation can match each particle to the collision that produced it. From there, analysts can distinguish the individual collisions and analyze each one separately. This is particularly useful when there's one interesting collision and 30+ boring ones, as it allows easy removal of much of the background.
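
To make the idea concrete, here's a toy sketch in Python of the vertex-association step (all names and numbers here are mine, not from any real reconstruction framework): each track is reduced to the z position where it extrapolates back to the beam axis, and tracks whose z positions cluster together get assigned to the same collision.

    # Toy vertex association: group tracks by the z position (along the
    # beamline) at which they extrapolate back to the beam axis. Real
    # experiments use far more sophisticated vertex fitters.
    def group_tracks_by_vertex(track_z_positions, window_mm=5.0):
        vertices = []
        for z in sorted(track_z_positions):
            if vertices and z - vertices[-1][-1] < window_mm:
                vertices[-1].append(z)  # close enough: same collision
            else:
                vertices.append([z])    # start a new collision vertex
        return vertices

    # Two collisions about 40 mm apart along the beamline, three tracks each:
    tracks = [-21.0, -19.5, -20.2, 18.8, 20.1, 19.4]
    for i, v in enumerate(group_tracks_by_vertex(tracks)):
        print("vertex %d: %d tracks near z = %.1f mm" % (i, len(v), sum(v) / len(v)))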

Other experiments aren't quite so lucky as the LHC, though. One experiment near and dear to me is Muon g-2 (pronounced "gee minus two"), an experiment at Fermilab that will measure the magnetic moment of the muon with unprecedented precision. Because it has a slightly smaller budget than the giant accelerators elsewhere, its detectors primarily consist of calorimeters, which capture the decay products of the muons (positrons, in the case of positively-charged muons) and measure their energies. Since most calorimeter stations have no tracking capability, they are vulnerable to pileup: two low-energy positrons that hit the same calorimeter at almost exactly the same time may erroneously be interpreted as a single higher-energy event. This wreaks all sorts of havoc with the final experimental analysis, so reducing the systematic uncertainty caused by pileup is a major goal for the experiment. This is being done in a variety of ways.
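
To see why merged hits are a problem, here's a toy illustration (pulse shapes and numbers invented, not g-2 code): two pulses from roughly 1 GeV positrons arrive a few nanoseconds apart, and a naive reconstruction that just reads the peak of the summed waveform reports a single hit at roughly twice the energy.

    # Toy pileup illustration: two low-energy pulses arriving a few
    # nanoseconds apart sum into a waveform whose peak looks like a
    # single higher-energy hit. Pulse shape and numbers are invented.
    import numpy as np

    def pulse(t, t0, energy, width_ns=4.0):
        # Idealized Gaussian pulse; amplitude stands in for energy.
        return energy * np.exp(-0.5 * ((t - t0) / width_ns) ** 2)

    t = np.arange(0.0, 100.0, 1.0)                         # 1 ns sampling
    waveform = pulse(t, 48.0, 1.2) + pulse(t, 52.0, 1.1)   # two ~1 GeV hits

    # A naive single-hit reconstruction just reads off the peak:
    print("apparent energy: %.2f GeV" % waveform.max())    # ~2.0, not 1.2 or 1.1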

The most obvious way of reducing pileup systematic uncertainties is by increasing experimental resolution, both spatial and temporal. Spatial resolution is increased by increasing the segmentation of each calorimeter. In the previous iteration of the experiment, at Brookhaven National Lab, the calorimeters were divided into a 5 by 7 grid of sub-detectors. Each of these is capable of detecting a hit and measuring the energy a particle deposits within it, so some pileup can be identified by the spatial separation of two particles. In the upcoming iteration, the detectors remain the same size but will be segmented into a 6 by 9 grid, which further reduces the amount of pileup by increasing the spatial resolution of the detector. There's a limit to how much you can usefully segment a detector, though: the material has a definite Molière radius, which characterizes the transverse size of the shower that a single high-energy positron lights up in it. If you segment the detector so that the individual components are smaller than the Molière radius, you haven't actually gained anything, as each event will simply light up multiple segments rather than giving you any higher resolution.
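
As a rough illustration of how segmentation helps (everything here is invented for the example), two showers landing in different segments of a 6 by 9 grid show up as two distinct local maxima, where an unsegmented detector would only report their summed energy:

    # Toy spatial pileup separation in a segmented calorimeter: count
    # segments that are local maxima above threshold, a crude stand-in
    # for real clustering algorithms.
    import numpy as np

    def count_showers(grid, threshold=0.1):
        n, (rows, cols) = 0, grid.shape
        for r in range(rows):
            for c in range(cols):
                if grid[r, c] < threshold:
                    continue
                neighborhood = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
                if grid[r, c] == neighborhood.max():
                    n += 1
        return n

    grid = np.zeros((6, 9))                # 6 by 9 segmentation
    grid[1, 2] = 1.0; grid[1, 3] = 0.4     # shower 1, spilling into a neighbor
    grid[4, 6] = 0.8; grid[3, 6] = 0.3     # shower 2
    print("distinct showers seen:", count_showers(grid))   # 2, not 1
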
Another technique is to increase temporal resolution, which improves physicists' ability to distinguish events that occur at nearly the same time. Sampling the calorimeter signals more rapidly goes a long way towards reducing the systematic error caused by pileup.
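
Here's a toy example of the temporal side (sampling rates and pulse shapes are again invented): sampled every 5 ns, two pulses 6 ns apart merge into a single bump, while at finer sampling the dip between them becomes visible and they resolve into two hits.

    # Toy temporal resolution: the same double pulse sampled coarsely
    # and finely. Finer sampling resolves the two hits. Illustrative only.
    import numpy as np

    def pulse(t, t0, energy, width_ns=2.0):
        return energy * np.exp(-0.5 * ((t - t0) / width_ns) ** 2)

    def count_peaks(samples):
        # Count strict local maxima in the sampled waveform.
        return sum(1 for i in range(1, len(samples) - 1)
                   if samples[i - 1] < samples[i] > samples[i + 1])

    for dt in (5.0, 0.5):                  # sampling period in ns
        t = np.arange(0.0, 100.0, dt)
        w = pulse(t, 47.0, 1.2) + pulse(t, 53.0, 1.1)
        print("%.1f ns sampling: %d peak(s)" % (dt, count_peaks(w)))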

I previously mentioned that tracking is a powerful tool for identifying and analyzing pileup. Because of this, Muon g-2 is also planning to use tracking chambers in front of two of its 24 calorimeters to determine the trajectories of positrons before they hit the calorimeters. Since the whole experiment sits in a magnetic field, lower-energy positrons follow more tightly curved paths than high-energy ones. The plan is to characterize pileup using the two stations with trackers, then apply that knowledge to the other calorimeters in the experiment.
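
The physics behind those curved paths is just circular motion in a uniform field: the radius of curvature is proportional to momentum, which in convenient units is p [GeV/c] = 0.3 · B [T] · r [m] for a unit-charge particle. A quick sketch (1.45 T is the nominal g-2 storage field; the radii are made up):

    # Momentum from track curvature in a uniform magnetic field:
    # p [GeV/c] = 0.3 * B [T] * r [m] for a unit-charge particle.
    def momentum_gev(radius_m, b_tesla=1.45):   # 1.45 T: nominal g-2 field
        return 0.3 * b_tesla * radius_m

    # A tightly curled low-energy track versus a nearly straight one:
    for r in (0.5, 7.0):                        # radii of curvature in meters
        print("r = %.1f m  ->  p = %.2f GeV/c" % (r, momentum_gev(r)))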

Finally, analytical techniques are being developed to subtract out pileup events. Simulations can help with that, as can large-scale analysis of the data set as a whole. There are also analysis techniques that work not by identifying and analyzing individual events, but by integrating the calorimeter signal over time to obtain a similar result. It's a really interesting area, with (as far as I can tell) no prescribed solution, which means that people are actively working on it and coming up with novel analysis techniques.
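
As a very rough sketch of the integrating idea (all shapes and numbers mine): throw down a few dozen overlapping pulses and integrate the resulting waveform. The total energy comes out right no matter how badly the individual pulses pile up, which is exactly the property such methods exploit.

    # Toy integrating analysis: instead of identifying individual hits
    # (where merged hits get miscounted), integrate energy versus time.
    import numpy as np

    rng = np.random.default_rng(42)
    t = np.arange(0.0, 1000.0, 1.0)                   # 1 ns samples
    hit_times = rng.uniform(20.0, 980.0, size=50)     # 50 positron hits
    hit_energies = rng.uniform(0.5, 2.5, size=50)     # GeV, invented

    waveform = np.zeros_like(t)
    for t0, e in zip(hit_times, hit_energies):
        waveform += e * np.exp(-0.5 * ((t - t0) / 3.0) ** 2)

    # Dividing the waveform's area by the area of a unit-energy pulse
    # recovers the total deposited energy despite heavy pileup:
    unit_pulse_area = 3.0 * np.sqrt(2.0 * np.pi)
    print("true total:       %.1f GeV" % hit_energies.sum())
    print("from integration: %.1f GeV" % (waveform.sum() / unit_pulse_area))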
