Advances in science, and particularly in physics, have often arisen from efforts to resolve small anomalies in accepted theories. Black-body radiation, which could not be adequately explained by classical physics, is an example. The notion of a universe undergoing accelerated expansion arose from the study of type Ia supernovae. This, in turn, led to theories postulating the existence of dark energy.
The currently accepted fundamental theory of particle physics is the Standard Model, confirmed by experiment after experiment, an era of confirmation that culminated with the discovery of the Higgs boson. The Standard Model nonetheless has well-known shortcomings. It provides no explanation for dark matter, and no explanation for the asymmetry between matter and antimatter. To date, its flaws have not been visible in accelerator experiments.
This is now changing.
How then to discern the structure that will ultimately replace the Standard Model? The discovery of a particle not described by the Standard Model obviously would have revolutionary implications. The impatience of some theoretical physicists to find a new particle recently led them to believe that they had discovered one with a mass of 750 GeV/c². They were mistaken. What they saw was nothing more than a simple statistical fluctuation.
The study of indirect effects from a hypothetical new particle offers a different view of the high-energy frontier. In assessing indirect effects, the extremely high energies necessary to create a new particle are unneeded; a fleeting, short-scale violation of energy conservation, of just the sort permitted by the Heisenberg uncertainty principle, allows the particle to pop into existence and pop out again. The virtual existence of a particle does not allow physicists unequivocally to determine its nature, only some of its properties, like a ghost known only for causing a chill.
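In the standard textbook form (a sketch added here for orientation, not part of the original argument), the uncertainty relation and the resulting lifetime of a virtual particle of mass m read

\[
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
\quad\Longrightarrow\quad
\Delta t \;\sim\; \frac{\hbar}{m c^{2}},
\]

so the heavier the virtual particle, the more fleeting its appearance, and the shorter the distance scale it probes.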
It can happen that a B meson decays into a pseudoscalar or vector particle and two light leptons. This type of decay is rare, on the order of one in ten million events. B meson decay of this kind is governed by a flavor-changing transition at the quark level, which in the Standard Model is prohibited at first order. These prohibitions may not apply to theories beyond the Standard Model. It thus makes perfect sense to look for new physics in B meson decays. To analyze these decays, quantum chromodynamics is needed at both the perturbative and non-perturbative levels.
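In quark-level notation (a schematic restatement of the paragraph above, not a quotation from the original), the underlying transition and the typical size of the rate are

\[
b \;\to\; s\,\ell^{+}\ell^{-},
\qquad
\mathcal{B} \;\sim\; 10^{-7},
\]

a flavor-changing neutral current that, in the Standard Model, can proceed only through loop diagrams; hence the suppression.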
Although it is possible to study B meson decay with respect to a specific model, it is much more efficient to use the language of effective field theories, which encompasses many models at a time. Effective field theories permit a systematic separation between short- and long-distance dynamics.
It is within short distances that one finds new physics.
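In this language, the short-distance dynamics is packaged into Wilson coefficients Ci multiplying local operators Oi. A standard form of the effective Hamiltonian for these transitions, given here as an illustrative sketch rather than as the precise expression used in our papers, is

\[
\mathcal{H}_{\text{eff}}
\;=\; -\frac{4\,G_F}{\sqrt{2}}\, V_{tb} V_{ts}^{*} \sum_i C_i(\mu)\,\mathcal{O}_i(\mu),
\qquad
\mathcal{O}_9
\;=\; \frac{e^{2}}{16\pi^{2}}
\left(\bar{s}\,\gamma_{\mu} P_L\, b\right)
\left(\bar{\ell}\,\gamma^{\mu} \ell\right),
\]

where any new short-distance physics would show up as a shift in one or more of the Ci.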
Non-perturbative, or hadronic, corrections include low-energy gluons or loops of charmed quarks, and are difficult to evaluate. They may mask signals of new physics. In the past, the study of B meson decay into a vector particle K* and a dilepton pair was mainly carried out using such observables as the polarization fraction of the K* vector. These were very sensitive to hadronic corrections. Form factors govern the transition between the B meson and the outgoing vector particle, in this case a K*. They are calculated in distinct ways according to the energy region under consideration. Form factors exhibit a key property. In the limit where the initial hadron is heavy and the final meson energetic, a network of relations emerges among them. These relations allow physicists to describe the form factors in terms of only two quantities, the so-called soft form factors. These quantities encode most of the information about the form factors themselves, and, of course, their evaluation is equally difficult.
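For illustration, two representative leading-order relations of this network (standard results in the large-recoil limit, quoted here for orientation rather than taken from the original text) are

\[
V(q^{2}) \;\simeq\; \frac{m_B + m_{K^{*}}}{m_B}\;\xi_{\perp}(q^{2}),
\qquad
A_{0}(q^{2}) \;\simeq\; \frac{E_{K^{*}}}{m_{K^{*}}}\;\xi_{\parallel}(q^{2}),
\]

where ξ⊥ and ξ∥ are the two soft form factors and E_{K*} is the energy of the outgoing K*.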
In 2005, Frank Krüger and I proposed a new type of observable that would radically change the study of semileptonic B-meson decays.1 This idea led to a substantial reduction in the sensitivity of observables to hadronic effects. As a result, semileptonic B decays could be employed efficiently in the search for new physics effects.
In 2013, Sébastien Descotes-Genon, Tobias Hurth, Javier Virto, and I built a complete basis of six optimized observables, Pi.2 At the same time, experimental physicists at the LHCb (the Large Hadron Collider beauty experiment), in particular Nicola Serra, found the idea seductive, and began a long experimental study to see if they could measure them. At a 2013 conference, Serra noted that Bd → K*μ⁺μ⁻ showed a deviation of 3.7 sigmas between the experimental measurement and the prediction of the Standard Model for P5′, and, similarly, a smaller deviation for P2. This deviation in P5′ became known as “the anomaly.” We then asked the question whether physics beyond the Standard Model could explain deviations in both P2 and P5′. If so, what would be its statistical significance with respect to the Standard Model itself?
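To give the flavor of the construction (the definition below is the one standard in the literature; a sketch, not a derivation), P5′ is built as a ratio of angular coefficients Ji of the decay distribution, chosen so that the soft form factors cancel at leading order:

\[
P_5' \;=\; \frac{J_5}{2\sqrt{-J_2^{s}\,J_2^{c}}}\,.
\]

It is this cancellation that makes the optimized observables far less sensitive to hadronic uncertainties than the polarization fractions used earlier.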
Together with Descotes-Genon and Virto, I completed the analysis of the complete basis of observables, and found that the deviations were clearly consistent.3 A global fit allowed us to test distinct hypotheses, and to determine whether any explained the data better than the Standard Model. Our results were significant to 4 sigmas, and indicated that the coefficient C9 of the four-fermion operator O9 was responsible for the deviations in P2 and P5′. In the Standard Model, the value of this coefficient is approximately 4. Our analysis showed that contributions from fundamentally new physics interfered destructively with those of the Standard Model, reducing the value of this coefficient to approximately 3.
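In numbers (paraphrasing the fit just described, with the values rounded as in the text), the pattern is one of destructive interference:

\[
C_9 \;=\; C_9^{\text{SM}} + C_9^{\text{NP}}
\;\approx\; 4 + (-1) \;=\; 3.
\]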
The physics community initially argued that our results reflected a simple statistical fluctuation. This argument ignored the fact that the deviations in P2 and P5′ were perfectly consistent. Later, in 2015, the LHCb confirmed the anomaly. A second reaction was to search for an alternative within the Standard Model that could be responsible for these deviations. Hadronic corrections became the focal point. These are the result of quark and gluon loops, and are difficult to calculate precisely. A careful analysis of these corrections by our group deconstructed the counterarguments based on form factors. Using a different combination of form factors, another group obtained results that converged perfectly with our own. There was a second, more difficult problem that we addressed: charm quark loops giving rise to non-perturbative, long-distance corrections. Although we included these in our analysis, the only relevant calculation in the literature was insufficient to explain the anomaly. New physics was required.
An unexpected development then followed. The LHCb widened the window of exploration, including both muon and electron decays. Measurements undertaken by the LHCb, first in 2014 and more recently in April 2017, seemed to convey striking implications for the universality of the gauge interactions in the Standard Model’s lepton sector. The Standard Model predicts that the decay of a charged B meson into a charged kaon and a lepton anti-lepton pair is the same regardless of whether the leptons are muons or electrons. It predicts that, above certain energy thresholds, the decay into muons or electrons occurs with roughly equal probability. Nevertheless, LHCb measurements of the ratio of the same semileptonic B decay into muons or into electrons showed a deficit of between twenty and twenty-five percent for muons.
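Written out (the definition of the ratio is standard; the numerical range simply restates the deficit quoted above), the measured quantity is

\[
R_K \;=\; \frac{\mathcal{B}(B^{+} \to K^{+}\mu^{+}\mu^{-})}{\mathcal{B}(B^{+} \to K^{+}e^{+}e^{-})}
\;\overset{\text{SM}}{\simeq}\; 1,
\qquad
R_K^{\text{exp}} \;\approx\; 0.75\text{–}0.80,
\]

with an analogous ratio RK* for the decay into a K*.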
These results have two significant consequences: in the first place, the measurements suggest that lepton flavor is not universal; and in the second, the counterarguments raised by a few groups in an effort to explain the anomalies on the basis of hadronic uncertainties are again foundering. If there is new physics affecting muons, and less so electrons, then the same explanation that accounts for the deviations in RK and RK* might also eliminate charmed loops as a way of explaining P5′.
We recently carried out an analysis attempting to determine the Wilson coefficient C9 using only violations of lepton universality. The result coincided with the value obtained from the observed anomalies in P5′. This coincidence weakens arguments that invoke hadronic uncertainties to explain P5′. A direct consistency test of the lepton universality violations suggests that hypotheses drawn from new physics are for the first time preferable to hypotheses drawn from the Standard Model, and this to more than five standard deviations.
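Schematically (our result paraphrased in round numbers, consistent with the shift of C9 from 4 to 3 described earlier), the consistency check amounts to

\[
C_9^{\text{NP}}\Big|_{R_K,\,R_{K^{*}}} \;\approx\; C_9^{\text{NP}}\Big|_{P_5'} \;\approx\; -1.
\]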
The Standard Model has begun to show its age. What may we now expect? It is entirely possible that in the coming months the LHCb will provide additional data about violations of lepton universality.
A new stage in particle physics may be in prospect.
Translated and adapted from the Spanish by the editors.