
Vol. 3, No. 1 / April 2017

Natural Physics

Aurélien Barrau


The Large Hadron Collider (LHC) is an exceptional instrument by any measure. In its current configuration, the LHC is the most powerful particle accelerator ever built, the product of two decades of labor, and a capital investment amounting to at least thirteen billion dollars.1 Access to the intimate structure of matter comes at a price. The energy scales at which collisions take place, and the precision needed to probe their traces, require instruments of great size and complexity.

The LHC is housed within a ring of tunnels twenty-seven kilometers in circumference that spans the Franco–Swiss border, at a depth of between fifty and one hundred and seventy-five meters beneath the surface. Within the LHC, two counter-rotating beams of protons are accelerated to velocities just short of the speed of light, each reaching an energy of 6.5 × 10¹² electronvolts (6.5 TeV) just before collision. The energy level is crucial; the higher the energy, the smaller the structures that can be probed and the heavier the particles that can be produced. Collisions between the beams occur inside four detectors, the largest of which, ATLAS and CMS, weigh 7,000 and 14,000 tons, respectively.
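Just how short of the speed of light can be made concrete with a back-of-the-envelope calculation. The sketch below assumes only the beam energy quoted above and the proton’s rest-mass energy (about 0.938 GeV); it is an illustration, not an LHC computation.

```python
# Back-of-the-envelope sketch: how close to the speed of light is a 6.5 TeV proton?
import math

PROTON_REST_ENERGY_GEV = 0.938   # proton rest-mass energy, ~0.938 GeV
BEAM_ENERGY_GEV = 6500.0         # 6.5 TeV per beam, the figure quoted above
SPEED_OF_LIGHT = 299_792_458     # m/s

# Relativistic energy: E = gamma * m * c^2, so gamma = E / (m * c^2).
gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV

# Speed as a fraction of c: beta = sqrt(1 - 1/gamma^2).
beta = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"Lorentz factor: ~{gamma:,.0f}")
print(f"Deficit from light speed: ~{(1.0 - beta) * SPEED_OF_LIGHT:.1f} m/s")
```

On these numbers, a 6.5 TeV proton trails a beam of light by only a few meters per second.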

The production of new particles is governed by Albert Einstein’s famous formula E = mc². Energy is transformed into mass, the kinetic energy of colliding particles giving rise to new particles. This is effectively what happens at the LHC. Antoine Lavoisier’s assertion that “nothing is lost, nothing is created, everything transforms” is not true with respect to elementary particles.2 Energy is the fundamentally conserved quantity.
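To get a sense of the scales involved, here is a toy calculation comparing the energy of one head-on collision with the rest-mass energies of familiar particles. The figures are those quoted above; in practice only a fraction of this energy goes into any single new particle.

```python
# Toy illustration of E = mc^2 at the LHC: collision energy expressed as mass.
COLLISION_ENERGY_GEV = 2 * 6500.0   # two 6.5 TeV beams meeting head on
HIGGS_MASS_GEV = 125.0              # Higgs boson rest-mass energy
PROTON_MASS_GEV = 0.938             # proton rest-mass energy

print(f"Energy available per collision: {COLLISION_ENERGY_GEV:.0f} GeV")
print(f"Equivalent, in principle, to ~{COLLISION_ENERGY_GEV / HIGGS_MASS_GEV:.0f} Higgs bosons")
print(f"or ~{COLLISION_ENERGY_GEV / PROTON_MASS_GEV:.0f} proton masses")
```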

Particle physics is founded on two fundamental principles. Gauge theory assigns a specific importance to internal symmetries, invariances that leave systems unchanged, and plays an essential role in unification, the key to high-energy physics. The goal of unification is to account for diverse phenomena using the fewest possible laws, or, even better, to show that laws that appear distinct are, in fact, the same. This was the basis for the unification of electricity and magnetism, and the subsequent unification of electromagnetism with the weak nuclear force. Bosons, which mediate the fundamental forces—photons in the case of electromagnetism, gluons for the strong nuclear interaction, the W and Z for the weak nuclear interaction—are associated with the localization of these global symmetries.

Spontaneous symmetry breaking represents the second of the two fundamental principles. This is what happens when a glass of water is chilled and freezes. The electric dipoles associated with the water molecules initially point in random spatial directions. No particular direction is globally privileged, and the system is rotationally invariant. When the water temperature falls below 0°C, the dipoles spontaneously align, a single privileged direction emerges, and the initial symmetry is broken. The phenomenon is random. A second glass placed in the same vicinity can align in a direction arbitrarily far from that of the first. A universe of particles can be unified at high energies and diversified at low energies.

The structure of particle physics is highly coherent. Matter is categorized into three families, each composed of two quarks and two leptons. When the bosons responsible for interactions are added, the only thing missing from a complete description is the Higgs field. This field, it is often argued, is the source of mass. This is only partially true. The majority of the mass of objects is associated with the binding energy of the strong nuclear force, far more so than with the Higgs field. The role of the latter is, in fact, more closely related to spontaneous symmetry breaking.

The discovery of the Higgs boson at the LHC was a remarkable achievement. This result affirmed both the predictive power of theoretical physics and the capabilities of experimental physics. Nearly fifty years after the boson was first predicted, that remarkable prediction was verified. The extraordinary collective effort involved in this feat demonstrates the vitality of particle physics on a global scale.

Despite this success, a somewhat bitter aftertaste lingers. The field might appear, on the surface at least, to be in rude health. But dig a little deeper, and doubts emerge. “The lack of new physics,” Natalie Wolchover has remarked, “deepens a crisis that started in 2012 during the LHC’s first run, when it became clear that its 8-TeV collisions would not generate any new physics beyond the Standard Model.”3 While experimental confirmation of expected results is necessary for the advancement of science, the LHC was also designed to reveal new physics. The LHC remains in operation, and, in the wake of the Higgs discovery, expectations remain high for the detection of unknown and unexpected phenomena.

Aside from the Higgs boson, nothing entirely new has been observed to date.

For a few brief months beginning in late 2015, the particle physics community was gripped by the news that an unexpected signal had appeared at the LHC. At around 750 GeV (750 gigaelectronvolts, hundreds of billions of times the energy of a photon of visible light), a small excess of photon pairs was observed, a signal stronger than that predicted by the standard model. The finding was announced cautiously. By the following day, several articles offering possible theoretical interpretations had already been posted on the preprint repository arXiv.

In the months that followed, hundreds of attempts to interpret this small blip were published. Some were based on minimal modifications of existing theories, while others offered bold, even revolutionary, hypotheses. Amid the hubbub, further data accumulated and were analyzed, allowing for an improved statistical evaluation. The observed excess turned out to be merely a random fluctuation.
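A schematic counting exercise shows why such a bump can seem compelling and then dissolve. The event counts below are invented for the sake of illustration (they are not the actual ATLAS or CMS yields); the point is only that the naive significance of an excess, roughly the excess divided by the square root of the expected background, shrinks if the excess fails to grow along with the data.

```python
# Schematic illustration: why an early excess can fade as more data arrive.
# The event counts are invented for illustration; they are not real LHC yields.
import math

def naive_significance(observed: float, background: float) -> float:
    """Crude significance of a counting excess: (N_observed - B) / sqrt(B)."""
    return (observed - background) / math.sqrt(background)

# Early dataset: a tantalizing bump over a small expected background.
print(f"Early run:  {naive_significance(14, 6):.1f} sigma")

# Four times more data: the background quadruples, but a statistical
# fluctuation does not keep pace, and the apparent significance collapses.
print(f"Larger run: {naive_significance(28, 24):.1f} sigma")
```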

There was not a trace of new physics.

There are a number of possible paths extending beyond the standard model that are currently being investigated at the LHC. The notion of leptoquarks has received considerable attention: hypothetical particles that couple to both quarks (which make up the protons and neutrons of atomic nuclei) and leptons (electrons, for example). These appear in several proposed extensions of the standard model, most notably in so-called technicolor and grand unification theories.

To date, evidence for leptoquarks has proved elusive.

Another avenue of inquiry is the search for additional dimensions. The idea that the world could be composed of more spatial dimensions than the three we experience is not new. It was first explored in an unsuccessful attempt to unify James Clerk Maxwell’s theory of electromagnetism and Einstein’s theory of general relativity.4 The advent of string theory, which predicted the existence of at least nine dimensions, led to renewed interest in the idea. Depending on their size and structure, additional dimensions may give rise to phenomena that can be observed with accelerators.

So far, nothing.

Mini black holes have been suggested as one of the consequences resulting from the existence of additional dimensions, although they are also conceivable under other circumstances. Their creation requires that the Planck energy, at which gravity becomes quantum, be much lower than usually expected. Such a phenomenon would be easily detectable. The black holes formed would evaporate almost immediately, the generated particles identifiable both by their nature and their characteristic directions of emission.5

Of these, there has been no sign either.

Supersymmetry has not fared much better. Its virtues are extraordinary. From a mathematical point of view, it is one of the only symmetries that can extend the standard model without violating fundamental principles. Supersymmetric theories predict that each particle must be associated with a superpartner. The idea is lovely. The partner of a matter particle is a particle of the kind that governs interactions, and vice versa. Supersymmetry thus bridges the worlds of interactions and particles, which are essentially independent in the standard model.

From a pragmatic standpoint, supersymmetry has an immense advantage: it explains the mass of the Higgs boson. In the absence of supersymmetry, the mass of this boson should be immense. This is because the Higgs is a scalar, a property that determines how it transforms, and its mass is therefore sensitive to corrections from quantum fluctuations. These corrections are very large, typically 100,000 billion times larger than the observed mass. It is possible, in principle, to add correction terms and make the prediction equal to the measurement. But this requires an incredibly fine adjustment: one must fine-tune the Lagrangian to fifteen digits. This is an artificial process, but not one ruled out by quantum field theory. If quantum theory does nothing to rule out such fine-tuning, it does nothing to justify it either. Theorists of the standard model are still struggling with the mass of the Higgs. If supersymmetry is taken into account, the picture looks quite different. Supersymmetry is not exact; when it is broken below a certain energy scale, the Higgs acquires its small observed mass.
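The origin of those fifteen digits can be sketched with an order-of-magnitude estimate. The sketch below takes the cutoff, the scale up to which the standard model is assumed to hold, to be the grand unification scale of roughly 2 × 10¹⁶ GeV; that choice, and the crude claim that the corrections are of the order of the cutoff, are illustrative assumptions, not a loop calculation.

```python
# Order-of-magnitude sketch of the Higgs fine-tuning problem described above.
# The cutoff chosen here (roughly the grand unification scale) is illustrative.
CUTOFF_GEV = 2e16        # assumed scale up to which the standard model holds
HIGGS_MASS_GEV = 125.0   # observed Higgs boson mass

# Without cancellations, quantum corrections push the Higgs mass toward the cutoff.
ratio = CUTOFF_GEV / HIGGS_MASS_GEV
print(f"Natural size of corrections / observed mass ~ {ratio:.1e}")
# ~1.6e+14: the bare parameters must cancel the corrections to roughly
# one part in 10^14, i.e. they must be tuned to about fifteen digits.
```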

Supersymmetry is not only useful and lovely; it also has a deeper theoretical justification. String theory remains the most extensively studied model of quantum gravitation. String theory and supersymmetry, now and forever, one and conceptually inseparable.

Supersymmetry is also useful in resolving the central question of unification. Coupling constants measure the strength of the interaction between particles and a field. These constants vary with energy. Within the standard model, there is no energy at which all three constants take the same value. Unification remains out of reach. Yet when supersymmetry is taken into account, the constants converge at a grand unification energy. There is still a chance that all forces may be derived from a single theory.
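The convergence can be illustrated with the textbook one-loop running of the three inverse couplings, α_i⁻¹(μ) = α_i⁻¹(M_Z) − (b_i/2π) ln(μ/M_Z). The sketch below uses commonly quoted approximate values of the couplings at the Z mass and the standard one-loop coefficients for the standard model and the MSSM; it is a schematic illustration, not a precision fit, and it ignores superpartner thresholds.

```python
# Schematic one-loop running of the gauge couplings: standard model vs. MSSM.
# Starting values and coefficients are commonly quoted approximations.
import math

ALPHA_INV_MZ = (59.0, 29.6, 8.5)   # inverse couplings of U(1), SU(2), SU(3) at the Z mass
M_Z_GEV = 91.19                    # Z boson mass, GeV

B_SM = (41 / 10, -19 / 6, -7)      # one-loop beta coefficients, standard model
B_MSSM = (33 / 5, 1, -3)           # one-loop beta coefficients, MSSM

def run(b_coeffs, mu_gev):
    """Inverse couplings at scale mu: alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2 pi) * ln(mu/M_Z)."""
    log = math.log(mu_gev / M_Z_GEV)
    return [a - b / (2 * math.pi) * log for a, b in zip(ALPHA_INV_MZ, b_coeffs)]

MU_GUT = 2e16  # a scale near the putative grand unification energy, in GeV
print("Standard model:", [round(x, 1) for x in run(B_SM, MU_GUT)])
print("MSSM:          ", [round(x, 1) for x in run(B_MSSM, MU_GUT)])
# In the MSSM the three values land nearly on top of one another;
# in the standard model they do not.
```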

Supersymmetry has an additional advantage in astrophysical terms. Most of the mass in the universe is of an unknown nature—so-called dark matter. It is not made up of visible objects, nor is it formed from any of the particles that have been identified using particle accelerators. This puzzle of cosmology is a problem of particle physics. Supersymmetry predicts the existence of heavy, stable, and weakly interacting particles; such particles remain the most widely accepted candidates for the composition of dark matter.

The LHC has not completely ruled out supersymmetry, but the results are far from encouraging. The simplest version of supersymmetry is known as the minimal supersymmetric standard model (MSSM). It is a model with real virtues. It eliminates the quadratic divergences of the standard model with a minimum number of parameters and additional fields. Under the MSSM, every particle has a superpartner, and the Higgs sector itself must be doubled. The many searches carried out at the LHC have so far failed to confirm the theory.

One might imagine nature remarking waspishly, “Your ideas are beautiful, coherent, and attractive, but they are also wrong.”

What to make of all this?

Evidence drawn from the LHC has done nothing to support fashionable theories about science as a social construction. Some theorists had become accustomed to the attractive idea that their theories were correct in proportion to the degree to which they elicited assent. The LHC has reminded us that this is not the case. In thinking about energy thresholds beyond the capacity of the LHC, some physicists supposed that their imagination could stand in for what they could neither see nor measure. Au contraire. No matter how appealing a hypothesis or how great its mathematical elegance, if it is not in agreement with observation, it must, to paraphrase David Hume, be consigned to the flames. The standard model was created in the 1970s, and some physicists have never had the experience of confronting their work directly with new data.

Naturalness arguments must be treated with some caution. A relatively vague and ill-defined concept, naturalness is often used to support new theories, or judge competing theories. One can, according to Sean Carroll, distinguish the naturalness of a configuration from the naturalness of the laws that govern its evolution.6 Natural configurations can be quantified by the concept of entropy. No one expects to see a liquid spontaneously separate itself into scotch and soda. The second type of naturalness is more obscure. A theory is considered natural if its dimensionless parameters are close to unity. There are mathematical reasons to support this assumption, based on the renormalization group and field theory. Yet it remains rather arbitrary.

Supersymmetry carries an additional implication regarding naturalness. Results from the LHC have ruled out supersymmetry in its simplest form. To the extent that being natural involves being simple, this would seem to suggest that nature, if it is not unwell, is at least unnatural. We must revise our expectations. It seems that in a natural universe, life, or, better, complexity in general, would be impossible, or at least extremely difficult to imagine. If the gravitational constant were, as one might have expected, naturally comparable to the other fundamental constants, quantum gravitational effects would be omnipresent, and the existence of stable structures no longer assured.

Under these circumstances, three lines of inquiry should be considered.

The first option is to assume that there exists an as-yet-undiscovered natural mechanism that can explain the values of the fundamental constants. This is the preferred route, but it is far from an obvious choice. Why did nature choose precisely those values that are specifically adapted to the emergence of complexity?

Good luck is the second option. Among all the possible laws, we have come across the nearly unique solution leading to a complex universe. This is a less than convincing argument.

The third option is a universe in which the fundamental constants are reinterpreted as environmental parameters. Just as our local environment is not representative of the observable universe as a whole, the laws of nature, or the fundamental constants, might not be representative of the universe beyond what we can observe. As complex organisms, we find ourselves, naturally enough, in a specific zone where things just work out.

Mystery? What mystery?

Most physicists dislike this idea intensely. Some see it as arbitrary and unconvincing. But this is not the case. Alan Guth and Andrei Linde’s theory of cosmological inflation is today widely accepted, and it generically predicts the emergence of bubble universes. If these bubble universes contain any dynamical fields at all, then they may well contain different effective laws. This multiverse has been predicted by some of the theories commonly used in cosmology.

This is not proof, but it is not nothing either.

Then there is the idea that the multiverse violates Occam’s razor. It is, alas, not easy to define simplicity. The choice, David Lewis remarks, is one between many worlds and many words.7 The multiverse is predicted by the simplest of our theories. In order to make it disappear, it is necessary to repair to theories that are far more complicated. Should we seek simplicity in the theory, or in the world? The answer is not obvious.

We must now welcome Karl Popper into the discussion. Because other universes are inaccessible, the argument goes, the multiverse is non-falsifiable; if non-falsifiable, then non-science. But this is not quite right. The existence of the multiverse is a consequence of well-defined and testable theories. If they are testable, they must be falsifiable. Had they been falsified, the multiverse would have long since disappeared as the cynosure of every physicist’s eye. It is indeed possible to make statistical predictions about the multiverse—and all physical predictions are, in one way or another, statistical. If the probability of different universes were known (not yet, but maybe), and if the frequency of observers within each were known as well (not yet, but it could be), it is conceivable that we might compare the observed universe with the architecture of the multiverse. The work of the sciences is infinitely more subtle than any interpretation of Popper might suggest. Almost all our best current theories, as Leonard Susskind has observed, whether in biology or in physics, initially appeared unfalsifiable.8

We now find ourselves at a pregnant pause. It is disappointing not to have found any signs of new physics. That much is beyond dispute. It is possible that none of the avenues currently being explored will work out. But all major scientific advances have required a reconsideration of the structure of scientific thought. There have been no major revolutions in physics for more than a century.

It is about time for something new.

Translated and adapted from the French by the editors.


  1. Alex Knapp, “How Much Does It Cost to Find a Higgs Boson?” Forbes, July 5, 2012. 
  2. In the original French: “Rien ne se perd, rien ne se crée, tout se transforme.” Antoine Lavoisier, Traité Élémentaire de Chimie, présenté dans un ordre nouveau, et d'après des découvertes modernes (Paris: Cuchet, 1789). 
  3. Natalie Wolchover, “What No New Particles Means for Physics,” Quanta, August 9, 2016. 
  4. Albert Einstein, “On the Generalized Theory of Gravitation,” Scientific American 182, no. 4 (1950): 13–17, doi:10.1038/scientificamerican0450-13. 
  5. A somewhat irrational but sociologically interesting concern has been raised in response to this hypothesis: would the earth be destroyed if Stephen Hawking’s prediction is proved wrong and the black holes did not evaporate? Accelerators, it should be noted, do nothing that nature does not do already; collisions at energies greater than those possible in the LHC take place continuously in the terrestrial atmosphere when it is bombarded by high energy cosmic rays. If black holes ready to swallow the earth could be produced, it would have happened a long time ago. 
  6. Sean Carroll, “Is Our Universe Natural?” Nature 440 (2006): 1,132–36. 
  7. David Lewis, On the Plurality of Worlds (Oxford: Blackwell, 1986). 
  8. Leonard Susskind and Lee Smolin, “Smolin vs. Susskind: The Anthropic Principle,” Edge: The Third Culture, August 18, 2014. 

Aurélien Barrau is a Professor of Astrophysics at the Université Grenoble-Alpes and Researcher at the Laboratoire de Physique Subatomique et de Cosmologie of the CNRS.

