Physics / Experiment Review

Vol. 3, No. 1 / April 2017

Higgs on the Moon

Adam Falkowski


At the beginning of the twentieth century, Ernest Rutherford uncovered the structure of an atom through a series of experiments in which metal foil was bombarded by a beam of energetic particles.1 Given that the size of an atom is just 10⁻¹⁰ meters, and its nucleus is one hundred thousand times smaller, this was an incredible feat by any standards. We have gone much further since.

By increasing the energy and intensity of the particle beams, and perfecting their detection apparatus, researchers have found even smaller constituents of matter. A host of elementary particles have been discovered in recent decades, including new forms of matter, such as the tau lepton or the bottom quark, and new force carriers such as the W and Z bosons. Current experiments can directly resolve distances at scales of 10⁻¹⁹ meters. Precision measurements allow us to peek at even smaller distances. Experiments searching for proton decay indirectly probe physical phenomena at distances on the scale of 10⁻³² meters.2
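The link between beam energy and resolvable distance follows from the uncertainty principle: an ultra-relativistic probe of energy E resolves structure down to roughly λ ≈ ħc/E. A minimal back-of-the-envelope sketch in Python, using the standard value ħc ≈ 197.327 MeV·fm; the example energies are illustrative:

```python
# Distance scale resolved by an ultra-relativistic probe of energy E,
# from the uncertainty-principle estimate lambda ~ hbar*c / E.

HBAR_C_MEV_FM = 197.327  # hbar*c in MeV*fm (standard value)

def resolvable_distance_m(energy_gev: float) -> float:
    """Distance scale in meters resolved at the given beam energy."""
    wavelength_fm = HBAR_C_MEV_FM / (energy_gev * 1e3)  # in femtometers
    return wavelength_fm * 1e-15  # femtometers -> meters

# Roughly 1 TeV per colliding quark or gluon at the LHC:
print(f"{resolvable_distance_m(1_000):.1e} m")    # ~2.0e-19 m
# A hypothetical 100 TeV collision would probe two orders of magnitude further:
print(f"{resolvable_distance_m(100_000):.1e} m")  # ~2.0e-21 m
```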

Progress in the realm of theory has been equally impressive. Rutherford’s discovery led to the development of quantum mechanics. Reconciling quantum mechanics and Albert Einstein’s theory of special relativity yielded quantum field theory, a powerful formalism for calculating the properties and interactions of fundamental particles. The formulation of the Yang–Mills theories in 1954, together with relevant experimental evidence, culminated in the construction of the Standard Model, a complete, consistent theory of matter and its fundamental interactions.3

Five decades later, the Standard Model still remains our best theory. Its predictions have been tested and confirmed in countless experiments at different energy scales. The magnetic moment of the electron can be measured with an accuracy of 10⁻¹³.4 The same quantity can be calculated theoretically using the Standard Model. The result agrees perfectly with the observations. The same is true of many other observables, and the Standard Model is almost always victorious.
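To give a flavor of how such a prediction is built up (a minimal sketch, not the Standard Model calculation itself): the leading quantum correction to the electron’s magnetic moment is Schwinger’s one-loop term a = α/2π, which already agrees with the measured anomaly to about 0.15 percent; thousands of higher-order diagrams account for the rest.

```python
import math

ALPHA = 1 / 137.035999  # fine-structure constant (CODATA 2014)

# Schwinger's 1948 one-loop QED result for the electron's
# anomalous magnetic moment, a = (g - 2) / 2:
a_one_loop = ALPHA / (2 * math.pi)

# Measured value of the electron anomaly (CODATA):
a_measured = 0.00115965218

print(f"one loop: {a_one_loop:.9f}")  # 0.001161410
print(f"measured: {a_measured:.9f}")  # 0.001159652
# The remaining ~0.15% difference is closed by higher-order corrections.
```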

Experimental anomalies have been observed that could be interpreted as evidence for particles or interactions beyond the Standard Model. These have turned out to be false alarms, due to a fluctuation in the data, an experimental error, or an incomplete theoretical calculation. A handful of anomalies persist, most notably in the measurement of the magnetic moment of the muon, but it is not a stretch to think that they too will be resolved once more precise data are collected.5

In the summer of 2012, researchers at the Large Hadron Collider (LHC) announced that they had observed the unequivocal signature of the Higgs boson, a cornerstone of the Standard Model.6 The existence of the Higgs establishes that the theory is internally consistent up to extremely high energies. All the particles predicted by the Standard Model have now been observed, and all eighteen of its free parameters are known with some accuracy. The results indicate that the Standard Model is valid to distances at scales of at least 10⁻¹⁹ meters.

One might imagine that the triumph is complete.

It may then come as a surprise to learn that particle physics is currently experiencing the most serious crisis in its storied history. The feeling in the field is at best one of confusion and at worst one of depression. How could this be? A complete theory perfectly describing experimental data is surely cause for celebration! Not so.

The theory may in fact be incomplete. If one takes into account the gravitational force, the Standard Model ceases to be a useful calculation tool at extremely high energies. Given that these energies are 10¹⁶ times greater than those available at the LHC, and 10¹⁴ times greater than the most energetic cosmic rays recorded on Earth, this is not in itself a pressing problem. We could for the time being live with a theory that is sufficient for practical purposes.

The Standard Model also cannot account for some observed features of our universe. It cannot explain the existence of dark matter and the absence of a vast amount of antimatter.7 Dark matter should consist of a stable particle that interacts weakly with ordinary matter and photons. The Standard Model does not offer a suitable candidate; for subtle quantitative reasons, neutrinos are unsuitable. The preponderance of matter over antimatter may, in fact, result from a tiny difference in their interactions, which is a feature of the Standard Model. Here, though, the Standard Model fails again at the quantitative level. Its parameters cannot account for the generation of a sufficiently large matter–antimatter asymmetry in the early universe.

There must be a more fundamental theory.

Certain aspects of the Standard Model remain extremely puzzling. Calculations indicate that the observed mass of the recently discovered Higgs boson is highly unlikely; quantum fluctuations should have shifted this mass upwards. It seems that, after all these years, our understanding of quantum field theory is flawed.
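The puzzle can be stated more concretely. If the Standard Model remains valid up to some very high energy scale Λ, a schematic one-loop estimate (keeping only the dominant top-quark contribution, with y_t the top Yukawa coupling) gives a quantum correction to the Higgs mass of order

```latex
\delta m_H^2 \sim -\frac{3 y_t^2}{8\pi^2}\, \Lambda^2 .
```

For Λ anywhere near the Planck scale, this contribution exceeds the observed m_H² by more than thirty orders of magnitude, so the bare mass parameter must cancel it with absurd precision.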

Theoretical physicists have proposed many candidates to replace the Standard Model.8 The most popular of these postulate a new symmetry between bosons and fermions: supersymmetry. This requires a multitude of new particles, some of which could play the role of dark matter. Another popular idea is that the Higgs boson is a composite particle made of new quarks bound by a new strong force. These are consistent quantum theories, but there is not yet a reliable criterion to judge which of them, if any, is realized in nature. Subjective aesthetic criteria have previously been applied that favor certain realizations of string theory. Researchers have also appealed to naturalness: the idea that explaining the observed Higgs mass should not require fine-tuning, in the form of large cancellations between the various contributions to it. These approaches seem misguided. Nature does not seem to conform to their predictions. Further experimental clues are desperately needed. None have been forthcoming from the LHC to date, even though it has almost reached its maximum energy.

The possibility that the LHC will only further confirm the Standard Model is often referred to as the nightmare scenario. The puzzles that emerge are not the nightmare; physicists love difficult problems. On the contrary, it is the indefinite persistence of the current confusing situation that is considered nightmarish.

The most efficient method developed thus far for revealing the fundamental secrets of nature has been to increase beam energy to probe increasingly small distances. Larger and more powerful colliders are seen as the solution. The design and construction of the LHC was a gargantuan task that required decades of work and billions of dollars. Such an undertaking will only become more difficult in the future. Would a doubling of energy be sufficient for any new collider project? Is a factor-of-ten increase needed? It may be the case that the answers we seek are to be found at energy levels that are simply unattainable for the foreseeable future. It is also unclear whether a bigger collider would resolve currently unanswered questions.

Is our century-long exploration of the high-energy frontiers coming to an end?

Following the discovery of the Higgs boson, such a question might seem blasphemous. History, on the other hand, is littered with examples of research programs that were at one time or another deemed important but were eventually scaled down, suspended, or abandoned. Consider Christian scholastic theology. In its heyday, the greatest minds of Western civilization were occupied with the problem of proving the existence of God. The tools available were inadequate, and the field ultimately reached a dead end.

A more recent and relevant analogy can be found in the history of manned spaceflight. Founded in 1961, NASA’s Apollo program culminated eight years later in the first manned mission to the moon. This amazing technological feat, achieved only after an enormous investment of resources, seemed at the time to herald a new era in human history. Sadly, the moon landing has been, to date, the apogee of manned space exploration, rather than just the beginning. Runaway costs have been one factor in curtailing spaceflight programs. A failure to define realistically achievable goals has been another. Apollo 17, the final manned mission to the moon, was launched in 1972. In the forty-five years since, mankind has not ventured again beyond low earth orbit.

After the moon missions were discontinued, the space program was downsized. NASA then elected to allocate a large portion of its budget to the space shuttle program: reusable spacecraft that could carry humans and large payloads into low earth orbit. From the need to find a purpose for the space shuttles came the idea for the International Space Station (ISS), which the shuttle fleet would be essential to construct and resupply. The ISS has become an incredibly expensive low-earth-orbit laboratory hosting experiments that are, for the most part, uninteresting. Some exceptions, like the AMS (Alpha Magnetic Spectrometer) cosmic ray detector, could have been launched on independent satellites, at a fraction of the cost, without any impact upon the physics program.

Hundreds of billions of dollars have been spent on the ISS without clearly defined goals, and without significant scientific or technological advances.9 In the absence of an ISS or space shuttle program, there is no guarantee, of course, that the funds would have been spent elsewhere on innovative scientific experiments. What may well prove to be more damaging to science in the long run is the wasted time. Thousands of talented scientists and engineers have devoted their careers to projects that led nowhere. For the near future, at least, the prospects for manned spaceflight are intertwined with the fortunes of the companies set up to offer private flights to wealthy tourists.

On the margins of the failed manned spaceflight program, NASA, together with the European Space Agency (ESA), developed a range of autonomous robotic missions. Probes have searched for water on Mars, dived into the atmosphere of Jupiter, landed on a comet, and passed by Pluto. Sophisticated satellite-based instruments have also had a huge impact on fundamental physics. The Cosmic Background Explorer (COBE), the Wilkinson Microwave Anisotropy Probe (WMAP), and the Planck satellite have provided insights into the very early universe, dark matter, and dark energy. These projects were all much smaller and cheaper than the manned spaceflight program.

The LHC experiment is not yet done—far from it. The collider is scheduled to operate for another fifteen years, accumulating something like one hundred times more data than it has to date. Physicists will be scrutinizing the data for any signs of new particles. The Higgs boson will be studied with an eye towards determining how well it conforms to the specific predictions of the Standard Model. A breakthrough may happen at any moment.

The nightmare scenario still looms.

It is now a realistic possibility that the LHC will not provide any unambiguous answers to the question of what, if anything, might lie beyond the Standard Model, but will instead leave us with a number of confusing puzzles. On this point, researchers differ only in their estimates of the probability of such an outcome.

CERN has recently begun planning the successor to the LHC, a one-hundred-kilometer-long collider that will be able to smash protons with an energy of 100 TeV, seven times larger than that now available at the LHC. The new collider is currently scheduled for completion in twenty years.

Some caution should be exercised with these predictions. Recent experience suggests that early timescale projections should be multiplied by at least two. Even more worrying is the lack of clearly defined goals. While the LHC was guaranteed to discover the Higgs boson or its theoretical alternatives, no single puzzle has been identified that the new collider would be certain to address conclusively.10 Some of the goals that have been mentioned include expanding searches for supersymmetry and dark matter.11 But there is no convincing theoretical argument that these phenomena will be found at energy scales beyond the reach of the LHC but within the capabilities of a new collider. The only identified goal thus far is a better determination of the shape of the potential energy of the Higgs field (more precisely, measuring the cubic Higgs boson self-coupling). Is this enough to justify a huge investment? It should be noted, of course, that future discoveries at the LHC or elsewhere may change the scope of the discussion and help shape the goals for a new collider. At the moment, convincing arguments are in short supply. As was the case with the space program, there is a danger that generations of physicists will become entangled in a project that leads nowhere.
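That goal, at least, can be stated sharply (the following is standard textbook material rather than anything new): in the Standard Model, the self-interactions of the physical Higgs field h are completely fixed by the measured mass m_H ≈ 125 GeV and the vacuum expectation value v ≈ 246 GeV,

```latex
V(h) = \frac{1}{2} m_H^2 h^2 + \lambda v\, h^3 + \frac{\lambda}{4}\, h^4 ,
\qquad
\lambda = \frac{m_H^2}{2 v^2} \approx 0.13 .
```

Measuring the coefficient of the cubic h³ term and comparing it with the predicted λv is precisely the determination of the Higgs potential referred to above; any deviation would signal physics beyond the Standard Model.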

As was the case with the space program following the moon landing, there is on the one hand a grandiose plan, and on the other, more modest proposals with clear goals. There are two ways to investigate matter at very small scales. The first is to channel vast amounts of energy into a small volume, so that it can be converted into new particles. This is the most straightforward method, and its results can be interpreted with minimal ambiguity.

The second approach involves taking advantage of the uncertainty principle of quantum mechanics. According to this principle, very heavy particles can be continuously created from the vacuum. These particles exist only for a short time, during which they may affect the behavior of known particles, such as electrons, muons, and Higgs bosons. By measuring the properties of known particles with great precision, and comparing the results to theoretical predictions, insights can be derived into physical laws at energies inaccessible to collider experiments.
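The reach of this indirect strategy follows from a simple scaling argument (an illustration, not a statement about any specific experiment): a virtual particle of mass M typically shifts an observable measured at a characteristic energy E by a relative amount of order (E/M)², so a measurement with fractional precision δ is sensitive to masses up to M ~ E/√δ. A hypothetical example in Python:

```python
import math

def indirect_mass_reach_tev(energy_gev: float, precision: float) -> float:
    """Rough mass scale (in TeV) probed indirectly, assuming a heavy
    particle of mass M shifts the observable by ~ (E / M)**2."""
    return energy_gev / math.sqrt(precision) / 1e3

# Illustrative numbers only: an observable at the muon mass scale
# (~0.1 GeV), measured to one part in a billion, reaches several TeV:
print(f"~{indirect_mass_reach_tev(0.1, 1e-9):.1f} TeV")  # ~3.2 TeV
```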

Many such experiments are currently being conducted. Examples include research into the magnetic and electric properties of elementary and composite particles, such as muons, tau leptons, protons, neutrons, and kaons. The MEG, Muon g-2, nEDM, NA62, and Qweak experiments, among others, indirectly probe physics at energies well above what can be reached at the LHC, or, for that matter, a future one-hundred-kilometer collider. In many cases, a modest budget can improve precision by orders of magnitude within just a few years. Precision experiments also touch upon many distinct areas of physics—atomic physics, laser physics, condensed matter physics, and nuclear physics—promoting collaboration between particle physics and other domains of science.

It is somewhat trickier to sell a one-hundred-kilometer tunnel as a scientific innovation.

Which is it to be: a one-hundred-kilometer collider, or one hundred precision experiments at CERN? This is a serious question. Not only the future but possibly the survival of particle physics is at stake. Shifting the focus away from high-energy colliders toward precision experiments may be the most efficient way to continue exploration of fundamental interactions in the decades ahead. It may even allow particle physics to emerge stronger from its current crisis.


  1. Ernest Rutherford, “The Scattering of α and β Particles by Matter and the Structure of the Atom,” Philosophical Magazine 6, no. 21 (1911): 669–88, doi:10.1080/14786440508637080. 
  2. K. Abe et al., “Search for Proton Decay via p → e⁺π⁰ and p → μ⁺π⁰ in 0.31 Megaton Years Exposure of the Super-Kamiokande Water Cherenkov Detector,” Physical Review D 95, 012004 (2017), doi:10.1103/PhysRevD.95.012004. 
  3. Steven Weinberg, “A Model of Leptons,” Physical Review Letters 19 (1967): 1,264–66, doi:10.1103/PhysRevLett.19.1264. 
  4. Peter Mohr, David Newell, and Barry Taylor, “CODATA Recommended Values of the Fundamental Physical Constants: 2014,” Reviews of Modern Physics 88, 035009 (2016), doi:10.1103/RevModPhys.88.035009. 
  5. G. W. Bennett et al., “Measurement of the Negative Muon Anomalous Magnetic Moment to 0.7 ppm,” Physical Review Letters 92, 161802 (2004), doi:10.1103/PhysRevLett.92.161802. 
  6. See ATLAS Collaboration, “Observation of a New Particle in the Search for the Standard Model Higgs Boson with the ATLAS Detector at the LHC,” Physics Letters B 716, no. 1 (2012): 1–29, doi:10.1016/j.physletb.2012.08.020; CMS Collaboration, “Observation of a New Boson at a Mass of 125 GeV with the CMS Experiment at the LHC,” Physics Letters B 716, no. 1 (2012): 30–61, doi:10.1016/j.physletb.2012.08.021. 
  7. Planck Collaboration, “Planck 2015 Results. XIII. Cosmological Parameters,” Astronomy & Astrophysics 594, no. A13 (2016), doi:10.1051/0004-6361/201525830. 
  8. Mauricio Bustamante, Leandro Cieri, and John Ellis, “Beyond the Standard Model for Montaneros,” CERN Yellow Report 1 (2010): 145–228, arXiv:0911.4409v2. 
  9. For an overview of the ever-changing and poorly defined objectives for the ISS, see Dwayne Day, “Twenty-Five Gigabucks of Steel: The Objectives of the International Space Station,” The Space Review, June 13, 2005. 
  10. Benjamin Lee, Chris Quigg, and Hank Thacker, “Weak Interactions at Very High Energies: The Role of the Higgs-Boson Mass,” Physical Review D 16 (1977): 1,519–31, doi:10.1103/PhysRevD.16.1519. 
  11. Nima Arkani-Hamed et al., “Physics Opportunities of a 100 TeV Proton-Proton Collider,” Physics Reports 652 (2016): 1–49, doi:10.1016/j.physrep.2016.07.004. 

Adam Falkowski is a theoretical particle physicist at the Laboratoire de Physique Théorique d’Orsay.

