Dark matter from protons

Artificial dark matter in the laboratory?

The Big Bang showed us how to do it. It generated so much of the mysterious dark matter that it now makes up about eighty percent of the total matter in the universe. With modern particle accelerators such as the Large Hadron Collider (LHC) at CERN near Geneva, particle physicists artificially create, in tiny regions of space, conditions that existed only a tiny fraction of a second after the Big Bang. Shouldn't it then also be possible to imitate the Big Bang and artificially produce particles of dark matter?

The situation is tense. Supersymmetry, superstrings, additional spatial dimensions, all well and good. But elementary particle physicists have seen none of this in their giant accelerators. Instead, they have confirmed the Standard Model of elementary particles with ever greater precision, and the desperate attempts to discover physics beyond this seemingly almighty bulwark have all failed (see article “Ingredients for a Universe”). On the other hand, these increasingly precise measurements restrict the possible properties of new hypothetical particles and forces more and more, should they really exist. Yet one thing is certain: 80 percent of the matter in the universe consists of dark matter, which simply does not fit into the Standard Model. The associated particles could simply be too heavy to have been produced at previous accelerator facilities. An accelerator with more energy was needed, because according to Albert Einstein's theory of relativity, more energy also means that particles of greater mass can be produced!

The LHC on the trail of dark matter

View into the LHC tunnel

The Large Hadron Collider (LHC) at CERN near Geneva should enable us to finally take the decisive step beyond the limits of the Standard Model (see article under "LHC"). Since 2010, protons have been accelerated there to energies of 3.5 TeV (teraelectron volts; one teraelectron volt is one trillion electron volts). In a few years this energy is to be doubled to 7 TeV. That is about 7000 times the energy equivalent of the proton mass and 7 times the world record set before 2010. The protons circulate in a so-called storage ring on head-on opposing courses, so that a total of 14 TeV of energy is available in a collision. However, only part of this is actually released in a given scattering process, because the protons do not collide as a whole: it is the constituents of the protons, the quarks and gluons, that collide. Nevertheless, this is sufficient to generate particles with masses of a few thousand proton masses in sufficient numbers. According to our theoretical expectations, that should be enough for the particles of dark matter.
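For readers who want to check the energy figures above, here is a short back-of-the-envelope sketch in Python (the proton mass value is approximate, not an official machine parameter):

```python
# Rough check of the energy claims in the text.
PROTON_MASS_GEV = 0.938          # proton rest-mass energy equivalent, ~0.938 GeV

beam_energy_tev = 7.0            # planned energy per proton beam
beam_energy_gev = beam_energy_tev * 1000.0

ratio = beam_energy_gev / PROTON_MASS_GEV
print(f"beam energy / proton mass ≈ {ratio:.0f}")       # ≈ 7463, i.e. "about 7000 times"

print(f"total collision energy: {2 * beam_energy_tev:.0f} TeV")   # 14 TeV head-on
```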

But it's not that simple. This can be seen from the fact that the two multi-purpose detectors ATLAS and CMS, which have been set up at the LHC to explore the TeV mass scale, are real monsters in which one could hide several single-family houses. They are packed with the latest detector technology and fast electronics with many millions of individual channels, and they are controlled by computer networks with thousands of nodes. The reason for this extreme effort is that one generally has to wait for a very large number of proton collisions before something interesting, such as a new heavy particle, is produced by chance.

The four LHC experiments

In order not to have to wait too long, one has to generate as many collisions as possible in the shortest possible time. At the LHC, two bunches of protons cross each other 40 million times a second in the center of each of the two multi-purpose detectors, with typically two dozen proton collisions taking place and overlapping one another. This creates a total of several thousand particles that the detectors have to deal with - quickly, 40 million times a second. However, only a small fraction of this enormous flood of information can be read out and stored. With the help of fast electronics and large computer networks, the detector must therefore immediately select the interesting collisions, forward them for storage, and discard the others.
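The collision rate implied by these numbers can be sketched in a few lines (the figures are the rough ones quoted in the text, not precise machine parameters):

```python
# Collision-rate arithmetic from the text (approximate).
crossings_per_second = 40_000_000      # bunch crossings per second
collisions_per_crossing = 24           # "typically two dozen" overlapping collisions

collisions_per_second = crossings_per_second * collisions_per_crossing
print(f"≈ {collisions_per_second:.1e} proton collisions per second")   # ≈ 9.6e8
```

Nearly a billion proton collisions per second, of which only a tiny selected fraction can ever be written to storage.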

It can take hours or more for a certain interesting, but rare, process beyond the Standard Model to occur even once. A detector must therefore recognize such special events with a high degree of reliability and forward the associated information. The proverbial search for a needle in a haystack is child's play compared with this task. But how is that supposed to work if we don't even know what exactly we are looking for in the hunt for dark matter?

Evidence of Dark Matter?

One possibility is offered by the fact that the dark matter particles being sought feel only very weak forces and therefore fly out of the detector without leaving the slightest trace. This severely disturbs the energy balance measured in the detector: energy is missing. This missing energy is the most important fingerprint of the sought-after events, and it can be combined with a few other signatures, such as high-energy bundles of particles in the detector, so-called jets. In this way one hopes to recognize the existence of the new particles very quickly.
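The idea behind the missing-energy fingerprint can be illustrated with a minimal sketch: in the plane transverse to the beams, the momenta of all visible particles should add up to zero, so an invisible particle shows up as the deficit. The momentum values below are purely illustrative:

```python
import math

# Hypothetical transverse momenta (px, py) in GeV of the visible
# particles of one collision -- invented numbers for illustration.
visible = [(120.0, 35.0), (-40.0, -80.0), (-15.0, 10.0)]

# Sum the visible transverse momenta; the shortfall is attributed
# to particles that escaped the detector unseen.
px = sum(p[0] for p in visible)
py = sum(p[1] for p in visible)
missing_pt = math.hypot(px, py)    # magnitude of the imbalance

print(f"missing transverse momentum: {missing_pt:.1f} GeV")   # 73.8 GeV here
```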

Cascade decay of supersymmetric particles

The next step will be even trickier. There is something new, but what is it exactly? From here on one needs a sufficiently large number of exotic events with new particles and a set of theoretical models that can be compared with the observations. The models with supersymmetric particles currently favored by particle physicists will be examined particularly closely; in these, a neutralino is usually the lightest and therefore stable supersymmetric particle (see article “The particle doubler: supersymmetry”). This neutralino would therefore be an essential component of dark matter. It is expected that a whole series of supersymmetric particles can be generated in the collisions, but that they decay again almost immediately, producing both ordinary particles and lighter supersymmetric particles, in cascades ending with the lightest neutralino.

The ordinary particles generated in such cascades are registered by the detector; only the lightest supersymmetric particle escapes unnoticed. This provides the crucial information about the supersymmetric particles involved in the decay cascades, and with a few ingenious tricks one can actually determine the masses of the supersymmetric particles produced with some accuracy. In addition, from the frequency and type of the measured decays, one can pin down the theoretical models that fit the observations, at least those that are not unnecessarily complicated and impress with their elegance. Within such a tightly constrained model, one then knows in principle all properties, in particular all masses, decay modes and interaction rates, of the supersymmetric particles.
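One of these "ingenious tricks" is the kinematic edge: in a cascade where a heavier neutralino decays via a slepton into the lightest neutralino plus two leptons, the invariant mass of the lepton pair has a sharp upper edge that depends only on the three masses involved. The formula below is the standard endpoint expression for this three-step decay; the mass values are purely illustrative:

```python
import math

# Hypothetical masses (GeV) for a chi2 -> slepton + lepton -> chi1 + 2 leptons cascade.
m_chi2, m_slep, m_chi1 = 180.0, 140.0, 100.0

# Standard dilepton kinematic-edge formula: the endpoint of the
# two-lepton invariant-mass spectrum fixes a relation between the masses.
edge = m_chi2 * math.sqrt(
    (1 - (m_slep / m_chi2) ** 2) * (1 - (m_chi1 / m_slep) ** 2)
)
print(f"dilepton mass edge: {edge:.1f} GeV")   # ≈ 79.2 GeV for these values
```

Measuring several such edges in different cascades gives enough relations to solve for the individual masses.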

In the last and decisive step, with this full knowledge of the properties of the new particles, one can finally calculate whether the correct amount of dark matter would have been generated under the conditions that prevailed shortly after the Big Bang, and whether it would have survived to this day. If a consistent picture emerges, we can be fairly sure that we are on the right track.
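The flavor of this final consistency check can be conveyed with the textbook rule of thumb for the thermal relic abundance of a weakly interacting particle, Omega h^2 ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩, where ⟨σv⟩ is the thermally averaged annihilation rate computed from the measured particle properties. The value of ⟨σv⟩ below is illustrative:

```python
# Rule-of-thumb relic-abundance estimate for a thermal WIMP
# (standard textbook approximation, not a full Boltzmann-equation solution).
sigma_v = 3e-26                 # annihilation cross section times velocity, cm^3/s (illustrative)

omega_h2 = 3e-27 / sigma_v      # predicted dark matter density parameter times h^2
print(f"predicted Omega h^2 ≈ {omega_h2:.2f}")   # ≈ 0.1, close to the observed value
```

If the ⟨σv⟩ implied by the accelerator measurements lands near this value, the picture "comes out consistently".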

Reinforcement by a linear collider

As already indicated, it is comparatively easy to prove the existence of new particles - for example supersymmetric ones - at the LHC. Precise quantitative measurements, such as determining the masses of the new particles, are much more difficult. The particle physicists therefore hope, in the medium term, for help from another accelerator that is technologically extremely sophisticated and currently in the planning phase: a so-called linear collider. Here electrons and their antiparticles (positrons) with energies of at least 500 GeV (about a million times the energy equivalent of the electron mass) are to be collided head-on. This energy is about five times as large as the previous world record for electrons. The events generated at such accelerators are comparatively easy and clean to interpret, and background processes are much rarer than at the LHC. Therefore, the properties of many new particles already discovered at the LHC could be measured directly and very precisely at this machine, without even having to assume any theoretical models.
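The "about a million times" figure can be checked the same way as the proton numbers earlier (electron mass value approximate):

```python
# Rough check of the linear-collider energy scale quoted in the text.
ELECTRON_MASS_GEV = 0.000511     # electron rest-mass energy, ~511 keV

energy_gev = 500.0               # energy scale quoted for the linear collider
ratio = energy_gev / ELECTRON_MASS_GEV
print(f"energy / electron mass ≈ {ratio:.1e}")   # ≈ 9.8e5, i.e. "about a million times"
```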

But is the mystery of dark matter solved with this? For example, are the candidates for dark matter particles generated at the accelerators really stable? We only see that they live long enough not to decay on their way out of the detector. But do they survive 14 billion years, the age of the universe? Or does dark matter even consist of several components, of which we have found just one?

Strong together

These questions can only be answered satisfactorily if everyone pulls together. The experiments at large particle accelerators provide a complete set of new particles with all their properties, at least within the framework of the theoretical models that match the observations. However, they cannot unequivocally establish the connection to dark matter. The experiments deep underground, which look for collisions between dark matter particles and atomic nuclei, measure the properties of the dark matter particles directly (see article “On the trail of dark matter: sit down and wait”). But they cannot say anything about possible heavier, unstable particles beyond the Standard Model; they see only a tiny section of the big picture. The same applies to the astrophysical experiments that search for annihilating dark matter in space (see article "Dark matter: Searching for traces in space"). Only the combination of the various experimental approaches can provide us with a conclusive and inherently consistent picture of dark matter and the associated elementary particle physics.