A potential portal into the invisible sector

I thought about the topic of my very first blog post for a while, because I would rather not start with an arbitrary status report from cosmology or particle physics, or with a random toy problem that I find interesting. To make it a more special occasion, I will enthuse over the news we recently heard from the LHC (Large Hadron Collider). If any of this is more than a mere statistical fluctuation, then we are at the dawn of a new phase in our fundamental understanding of the Universe. And I really mean it.

For the expert audience, the upshot is that the ATLAS and CMS experiments at the LHC have seen an excess of diphoton events near an invariant mass of 750 GeV in the dataset collected during the second run at 13 TeV. But before I delve into the exciting part, let me take a step back and go through some basics.

A particle accelerator is a device that accelerates subatomic particles such as protons, electrons or their anti-particles to relativistic speeds, i.e., close to the speed of light. This can be achieved in several ways; at the LHC, radio-frequency cavities accelerate the protons, while magnetic fields generated by superconducting magnets keep them on their circular orbits. After the particles are accelerated, they are brought into collision at interaction points surrounded by particle detectors. Given the large phase space available to the colliding particles (thanks to their large kinetic energies), a zoo of final states can be produced. The detectors then measure the momenta, energies and charges of the resulting secondaries in order to identify them, as well as any resonances (unstable particles) generated during the interaction.
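
To get a feel for what "close to the speed of light" means here, the minimal sketch below (my own back-of-the-envelope numbers, not anything from the experiments) computes the Lorentz factor of a single 6.5 TeV proton in one of the LHC beams.

```python
# Back-of-the-envelope sketch: how relativistic is a 6.5 TeV proton in an LHC beam?
# (13 TeV collisions correspond to 6.5 TeV per beam; illustrative numbers only.)
import math

m_p = 0.938       # proton rest mass in GeV
E_beam = 6500.0   # beam energy per proton in GeV

gamma = E_beam / m_p                    # Lorentz factor, E = gamma * m
beta = math.sqrt(1.0 - 1.0 / gamma**2)  # v/c

print(f"Lorentz factor gamma ~ {gamma:.0f}")   # about 6900
print(f"v/c ~ {beta:.10f}")                    # deficit from 1 is only ~1e-8
```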

The idea behind such experiments is to probe how fundamental particles interact. Thanks to the efforts of particle physicists over the last century, we have effective theories that explain and predict these interactions at low energies. However, we currently do not have a so-called UV-complete theory that is valid at arbitrarily high energies. Our most complete theory is the Standard Model, which separately describes the electroweak and strong interactions, as well as the generation of mass for otherwise massless particles. However, it has a list of shortcomings that will be an endless source of material for my future blog posts.

We would therefore like to collide particles at the highest possible energy as an experimental probe of how Nature behaves in this uncharted territory. The cool thing is that people have been waiting for a \(\sim\) TeV-scale accelerator for decades. We are the lucky generation that has witnessed the commissioning of this beautiful machine and its first major result, the discovery of the Higgs boson back in 2012.

By the way, the TeV scale is where the Standard Model is expected to break down. If the theory is assumed to be valid up to much higher energies, it predicts quantum mechanical corrections to the Higgs mass that are far larger than the mass itself! Reconciling the two requires fine tuning the free parameters of the model, which bothers physicists. For example, imagine generating random universes using the Standard Model: because of this fine tuning, it would be extremely unlikely to get a universe like ours. Reversing the argument, given that we do live in this Universe, the Standard Model is unlikely to be a good description of Nature much above the TeV energy scale.
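
To put rough numbers on this, here is a back-of-the-envelope sketch (a textbook-level estimate, not a precise Standard Model calculation): I assume the dominant correction comes from the top-quark loop, take the top Yukawa coupling to be roughly one, and pick a cutoff of 10 TeV as the scale up to which the theory is trusted.

```python
# Rough illustration of the fine-tuning argument. The top-quark loop shifts the Higgs
# mass squared by roughly delta ~ -(3 * y_t^2 / (8 pi^2)) * Lambda^2, where Lambda is
# the energy scale up to which we trust the theory (assumed here, for illustration).
import math

m_H = 125.0        # observed Higgs mass in GeV
y_t = 1.0          # top Yukawa coupling, roughly 1
Lambda = 10_000.0  # assumed cutoff: 10 TeV

delta_m2 = -3.0 * y_t**2 / (8.0 * math.pi**2) * Lambda**2  # loop correction to m_H^2
bare_m2 = m_H**2 - delta_m2                                # required "bare" parameter

tuning = m_H**2 / abs(delta_m2)  # how delicately the two terms must cancel
print(f"correction to m_H^2 : {delta_m2:.3e} GeV^2")
print(f"required bare m^2   : {bare_m2:.3e} GeV^2")
print(f"cancellation level  : ~1 part in {1/tuning:.0f}")  # ~1 part in a few hundred
```

Push the cutoff to much higher scales and the required cancellation becomes astronomically precise, which is the sense in which a "random" Standard Model universe would almost never look like ours.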

Anyway, back to the 750 GeV diphoton excess. The analysis that I will discuss is possibly the cleanest experimental channel available to ATLAS and CMS. The idea is to look for events with just two photons, whose identification is relatively easy, as the result of a proton-proton collision. The Standard Model robustly predicts the number of such diphoton events when two protons collide. Therefore, any excess in the diphoton spectrum, i.e., the number of diphoton events as a function of the diphoton invariant mass, above the Standard Model prediction will probably be the subject of a future Nobel prize.
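
For concreteness, here is a minimal sketch of the quantity being histogrammed: the invariant mass of the photon pair, computed from the two measured photon energies and their opening angle. The numbers below are made up purely for illustration; they just happen to land near the region of interest.

```python
# Invariant mass of a photon pair. For massless photons,
# m_gg^2 = 2 * E1 * E2 * (1 - cos(theta)), with theta the opening angle.
# The energies and angle here are hypothetical, illustrative values.
import math

E1, E2 = 450.0, 441.0  # photon energies in GeV (made up)
theta = 2.0            # opening angle between the photons in radians (made up)

m_gg = math.sqrt(2.0 * E1 * E2 * (1.0 - math.cos(theta)))
print(f"diphoton invariant mass ~ {m_gg:.0f} GeV")  # ~750 GeV for these numbers
```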

Below, I show the money plot from the ATLAS collaboration, presented at the Moriond conference, which summarizes their results. As it is a very busy plot, let us walk through it slowly.

The horizontal axis is the invariant mass of the photon pair; for a genuine resonance decaying to two photons, the events would pile up at the resonance mass. The vertical axis of the top panel is simply the number of events collected per unit mass. The panel just below gives the residuals between the data and the best-fit background model. Given the experimental simplicity of the diphoton channel, the background is modeled with a simple power law. You can see that there are roughly four adjacent mass bins around 750 GeV, each showing a \(1\sigma-3\sigma\) fluctuation. This is intriguing on its own, but the real excitement comes from the fact that CMS has seen a similar upward fluctuation at the same mass. The combined local significances are in the \(3\sigma-4\sigma\) ballpark, which is far from a discovery claim, especially after the further suppression due to the trials factor. The trials factor comes in because, if you stare at a large enough number of random variables, you are almost guaranteed to see any value you wish.
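
To make the trials factor concrete, here is a toy Monte Carlo sketch of my own (with an arbitrary number of bins, not the experiments' actual look-elsewhere computation): even with pure background, seeing a \(3\sigma\) upward fluctuation somewhere in a scan over many mass bins is not that rare.

```python
# Toy Monte Carlo of the trials factor (look-elsewhere effect): scan many independent
# background-only mass bins and ask how often at least one fluctuates up by >= 3 sigma.
# The number of bins is an arbitrary choice for illustration.
import random
from statistics import NormalDist

n_bins = 40            # independent mass bins being scanned (assumed)
n_experiments = 20_000
threshold = 3.0        # local significance threshold, in Gaussian sigma

# Model each bin's background fluctuation as a standard normal, a fair approximation
# for bins with large expected counts.
hits = 0
for _ in range(n_experiments):
    max_sigma = max(random.gauss(0.0, 1.0) for _ in range(n_bins))
    if max_sigma >= threshold:
        hits += 1

p_local = 1.0 - NormalDist().cdf(threshold)
print(f"P(one fixed bin >= {threshold} sigma)     ~ {p_local:.5f}")            # ~0.00135
print(f"P(any of {n_bins} bins >= {threshold} sigma) ~ {hits / n_experiments:.3f}")  # ~0.05
```

In other words, a \(3\sigma\) local fluctuation shows up somewhere in a 40-bin scan about once in every twenty background-only "universes", which is why the global significance is always quoted lower than the local one.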

Looking at the history of particle physics, it is slightly discouraging to see that such fluctuations come and go from time to time. It is therefore likely that the 750 GeV feature does not correspond to a new physical process, but is rather the result of an unlikely random fluctuation of the background. Interestingly, in the past few months a huge number of papers have been written on models that try to explain the limited amount of available information, such as the resonance mass, its width, and the fact that the excess was not clearly seen in the first run at 8 TeV. Although this burst of 750 GeV literature could be a type of ambulance chasing, there is always a chance that we are looking at a real signal. In the next two months, ATLAS and CMS will update their results. We will then know whether we have been witnessing the opening of a new chapter in particle physics or getting too excited about a statistical fluctuation. Fingers crossed!