Letters to the editors

Vol. 5, No. 1 / December 2019

To the editors:

We disagree with John Norton’s arguments against Landauer’s principle.1 Rather than it being a hot mess, we find Landauer’s principle to be easily understood, easily derivable from basic principles of thermodynamics and statistical physics, and most importantly, in accordance with experimental tests of the same.2 Furthermore, if there is no entropic cost associated with erasure of information, then a Szilard engine could violate the second law of thermodynamics via a Maxwell demon.3

Norton’s main arguments can be summarized as follows:

  • “The logical specification of the erasure process does not require any compression of the phase space, only a relocation of which part is occupied.”
  • “Boltzmann’s formula is misused to ascribe an incorrect thermodynamic entropy to a memory device; and the entropy creation required to suppress the fluctuations necessitated by the formula is neglected.”
  • The fact that Landauer’s bound cannot actually be reached, together with a “no-go” theorem.

In this letter, we will address these objections, offer a simple illustration of the Landauer principle for an N-state memory, and discuss the limitations of Landauer’s principle.

Before proceeding, it is important to agree on a definition for the term “erasure” in relation to information. If we mean what Rolf Landauer was discussing in his original article—the “RESTORE TO ONE” operation by use of a thermal reservoir—then his principle follows very simply. Norton argues that this logical reset can be accomplished by relocating the part that is occupied. Landauer discusses this exact idea in his article and shows that no conservative one-to-one operation can achieve this outcome. To initialize the system, one must perform different operations, depending on the logical state of the system:

Note, however, that in order to avoid energy expenditure we have used two different routines, depending on the initial state of the device. This is not how a computer operates. In most instances a computer pushes information around in a manner that is independent of the exact data which are being handled, and is only a function of the physical circuit connections.4

Norton’s relocation definition is a conditional operation; rather than erasing the memory, it simply transfers the information to a higher level of a memory hierarchy. An operator must know the state of the system in order to apply the right operation to relocate it. The operator thus serves as the memory in another part of the computer. The important difference is one of process: with a single operation, any state of the memory, rather than one specific state, is mapped to the fiducial state.

Landauer’s insight was that a dissipative process was needed to carry out this many-to-one operation. He employed thermodynamics, the simplest dissipative resource. We define erasure in Landauer’s sense of the word: the use of thermal resources to reset the memory of a computing element.

Before addressing Norton’s other objections, it is worth examining a simple illustration of Landauer’s principle. This example will serve as a reference for the rest of the points raised.

Consider a one-dimensional potential V_0(x) with N local minima, each separated from its neighbors by potential barriers of height ε (Figure 1). This generalizes the binary digit memory discussed by Landauer, and later by Charles Bennett, to encoding an N-ary digit worth of information.5 Information will be stored in a particle residing in one of these potential wells. For a thermal reservoir at temperature T, the activation time over the barriers can be made exponentially long in the ratio of barrier height to temperature, and can then be neglected on the timescales of interest. The system begins in a logical state; it is not free to thermally explore the other wells, which correspond to other logical states. The particle is in a particular state resulting from a logical operation, but we wish to reset the system to a fiducial state, say, the far left well (Figure 1), using only thermodynamic resources, that is, by exchanging heat and work with reservoirs, for reasons already discussed.
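To put a number on how long a stored state survives, consider a minimal sketch (not from the letter) that assumes an Arrhenius-type escape rate with a hypothetical attempt period t0:

```python
import numpy as np

# Arrhenius-type estimate of the mean escape time over a barrier of
# height eps at temperature T: t_escape ~ t0 * exp(eps / kT).
# The attempt period t0 is a hypothetical value chosen for illustration.
t0 = 1e-9                      # assumed attempt period, seconds
for ratio in (10, 30, 50):     # barrier height eps in units of kT
    t_escape = t0 * np.exp(ratio)
    print(f"eps = {ratio:2d} kT  ->  t_escape ~ {t_escape:.1e} s")
# eps = 50 kT gives ~5e+12 s, roughly 160,000 years: the memory is stable.
```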

Figure 1. Demonstration of Landauer’s erasure protocol for a potential with N = 6 local minima, where each local minimum represents a logical memory state. The inset draws a parallel to the more common free-particle bit, shown here in a two-sided box with a movable divider.

This erasure of the system state is carried out in a series of steps (Figure 1); a numerical sketch of the full sequence follows the list:

  1. The potential barriers dividing the wells are dropped to a level below kT, the product of the Boltzmann constant and the temperature, allowing the particle to explore all the other wells over a period of time, in contact with the thermal bath.
  2. A force is applied quasi-statically, e.g., with an electric field acting on a charged particle, such that the effective potential is V_eff = V_0(x) + eV_e(x, t), forcing the particle over to the left well.
  3. The barriers dividing the wells are raised to their former positions.
  4. The leftward force is quasi-statically removed, returning the potential to the original configuration.
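The protocol can be realized concretely in simulation. The following is a minimal sketch, assuming an overdamped Langevin particle in a cosine potential with reflecting walls; the well shape, schedules, and every parameter value are hypothetical choices, not taken from the letter:

```python
import numpy as np

# A minimal overdamped-Langevin realization of the four-step protocol.
# Wells sit at x = 0, 1, ..., N-1; the fiducial state is the well at x = 0.
# Units: kT = 1, friction gamma = 1.  All values here are illustrative.
rng = np.random.default_rng(0)
kT, gamma, N = 1.0, 1.0, 6
barrier_high = 10.0            # barrier height in units of kT
dt, steps = 1e-4, 100_000      # time step and steps per stage

def force(x, barrier, tilt):
    """-dV/dx for V(x) = barrier * (1 - cos(2 pi x)) / 2 + tilt * x."""
    return -barrier * np.pi * np.sin(2 * np.pi * x) - tilt

def evolve(x, barriers, tilts):
    """Euler-Maruyama integration through one stage of the schedule."""
    for b, f in zip(barriers, tilts):
        x += (force(x, b, f) * dt / gamma
              + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal())
        # reflecting walls close off the chamber at both ends
        if x < -0.5:
            x = -1.0 - x
        if x > N - 0.5:
            x = 2 * (N - 0.5) - x
    return x

ramp = np.linspace(0.0, 1.0, steps)
hold, off = np.ones(steps), np.zeros(steps)
f_left = 8.0                   # strength of the leftward force (hypothetical)

x = 3.0                                                  # start in well j = 3
x = evolve(x, barrier_high * (1 - ramp), off)            # 1. lower barriers
x = evolve(x, off, f_left * ramp)                        # 2. ramp force on
x = evolve(x, barrier_high * ramp, f_left * hold)        # 3. raise barriers
x = evolve(x, barrier_high * hold, f_left * (1 - ramp))  # 4. remove force
print("final well:", int(round(x)))                      # almost surely 0
```

Because the force schedule never depends on the initial well, the same four stages map every logical state to the fiducial well, which is the many-to-one character on which Landauer’s argument turns.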

This is a well-defined operational procedure. Regardless of the prior state of the system, the system will almost certainly be in the fiducial state when the protocol ends. The work and heat needed from the thermal and work reservoirs to carry out this process can now be computed. The system begins in a definite, well-defined state in a given well, j, in contact with the thermal bath. The partition function for that well is

\[ Z_j = \int_{x_1}^{x_2} \exp\!\left(-\frac{V(x)}{kT}\right) dx, \]

where k is the Boltzmann constant and x_1 and x_2 denote the boundaries of the well. In the first step, the barrier can be lowered with negligible work on the particle. Once the first step and thermalization are complete, the new partition function for the single particle is (approximately) Z_tot,i = Z_1 + Z_2 + … + Z_N. The system is now described by global thermal equilibrium: given sufficient time, a particle at any site can explore the other wells. By expanding into all the wells, the entropy increases without any additional work or heat. This results in a free energy change of

\[ \Delta F = -kT \ln\frac{Z_{\text{tot},i}}{Z_j} = -kT \ln N, \]

where, in the last step, the wells are taken to be identical for simplicity. This irreversible process is associated with an increase in entropy of ΔS = k ln N. The free energy is given by F = U − TS, where U denotes the internal energy and S the entropy. The second, third, and fourth steps are quasi-static and thermodynamically reversible. This reversible compression in state space is permitted because it is performed after the thermodynamically irreversible expansion that ensures logical irreversibility. After the quasi-static process is complete, the partition function in the final state for a single particle is Z_tot,f = Z_1. The basic principles of thermodynamics dictate that the entropy, internal energy, and Helmholtz free energy are state functions: for a quasi-static path through the control landscape, the net work done and heat transferred depend only on the initial and final states, not on the path taken. The change in Helmholtz free energy in the second, third, and fourth steps is given by

\[ \Delta F = -kT \ln\frac{Z_{\text{tot},f}}{Z_{\text{tot},i}} = kT \ln N = W, \]

where W is the work done on the system in the quasi-static process; the identification ΔF = W holds because the process is isothermal.
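The bookkeeping above is easy to check numerically. A minimal sketch, assuming a cosine well shape and hypothetical parameter values, in units where kT = 1:

```python
import numpy as np

# Numerical check that the free-energy changes computed above come out
# to -kT ln N and +kT ln N.  The cosine potential and all parameter
# values are hypothetical illustrations.  Units: kT = 1.
kT, N, eps = 1.0, 6, 10.0
V0 = lambda x: eps * (1.0 - np.cos(2.0 * np.pi * x)) / 2.0  # minima at 0..N-1

def Z_well(j, num=20001):
    """Z_j = integral of exp(-V0(x)/kT) over well j, i.e., [j-1/2, j+1/2]."""
    x = np.linspace(j - 0.5, j + 0.5, num)
    return np.trapz(np.exp(-V0(x) / kT), x)

Z_j = Z_well(0)                              # initially occupied well
Z_tot_i = sum(Z_well(j) for j in range(N))   # after step 1 (thermalization)

dF_step1 = -kT * np.log(Z_tot_i / Z_j)       # irreversible expansion
W_steps_2_to_4 = kT * np.log(Z_tot_i / Z_j)  # quasi-static recompression
print(dF_step1, -kT * np.log(N))             # both ~ -1.792 = -kT ln 6
print(W_steps_2_to_4, kT * np.log(N))        # both ~ +1.792 = +kT ln 6
```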

While the change in free energy is equal and opposite to the first step of the process, the physical interpretation is quite different. Previously, the change in free energy was associated with an increase in entropy while the internal energy remained unchanged. For the quasi-static isothermal process, the change in free energy is given by the work done on the system. From the first law of thermodynamics, dU = dQ + dW = TdS + dW. In the case of quasi-static processes, work is being done on the system and heat is being expelled into the environment. The system ends in the same local thermal state in which it begins, with the only exception being a spatial translation by some amount that differs for each possible initial logical state. The change in internal energy must be zero through the process, and, by the first law, the heat expelled to the environment is the same as the work done on the system, kT ln N. For a thermal reservoir, this corresponds to an increase in entropy of k ln N. This conclusion is quite general and does not depend on the detailed shape of the potential wells, or the path taken in the control space, so long as it is sufficiently slow.
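For scale, the heat expelled is minute at laboratory temperatures; a short check, with the temperature an assumed value:

```python
import numpy as np

k = 1.380649e-23                   # Boltzmann constant, J/K
T = 300.0                          # assumed room temperature, K
for N in (2, 6):
    print(f"N = {N}: kT ln N = {k * T * np.log(N):.2e} J")
# N = 2 gives ~2.87e-21 J per erased bit, the familiar Landauer figure.
```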

Consider some of Norton’s other objections. “This treatment of the memory state as if it were the thermalized state,” he writes, “is pervasive among researchers.” The comment refers to the state of the system before the first step. The system is in a well-defined local thermal state, with a partition function Z_j. The partition function is calculated with the potential barriers raised sufficiently high that the particle can explore only that particular well and not the rest of the system. The system of partitions can then act as a coarse-grained logical memory. Other treatments emphasize this by simply placing a partition between the subsystems.

In his essay, Norton offers a no-go theorem. The proof amounts to the observation that a thermodynamic process whose initial and final states have equal entropy assigns those states equal probability; entropy must therefore increase if a substantial probability of successful transition into the fiducial state is required. But Landauer’s principle is a lower bound on the amount of energy dissipated, and in practice erasure usually wastes much more. Silicon-based computing devices exceed the bound by a factor of about a thousand.6 Landauer also points out that erasing quickly requires dissipating additional energy. Norton goes further, claiming that the bound cannot be reached even in principle. The counterexample given above, which correctly connects the canonical partition function to the free energy, demonstrates that it is, in fact, reachable. Norton overlooks the use of external work, rather than fluctuations, to reduce the entropy of the system state to its original value. There is also the fact that thermodynamics requires the entropy of system and environment together to increase or stay the same, not that of the system alone. In his example, the process has not yet completed a cycle, and a change in potential energy produces both a change in entropy and a change in the internal energy of the system.

As shown above, the Landauer bound can be reached using standard statistical mechanical methods and thermodynamic arguments. Many studies in the literature offer alternative derivations. We have used a thermodynamically irreversible operation to erase the memory, but operations that are microscopically reversible can also be used. Kurt Jacobs has shown that different system states can be mapped reversibly to different microstates of the environment.7 This amounts to erasure of the system memory, because only the macroscopic (thermodynamic) variables are accessible, and the procedure cannot be thermodynamically reversed. The information is not really lost; it is only encoded in inaccessible microstates of the thermal environment corresponding to the same macrostate, increasing its entropy by k ln N.

Although Landauer’s principle is well established using thermodynamic resources, there are limitations beyond this paradigm. If information about the observer and the disturbance of the measurement are incorporated, the principle can be generalized.8 It can also be violated if non-thermal resources are used, such as a squeezed thermal bath,9 although there are costs associated with preparing that resource. Indeed, in a fully quantum situation, it is possible to make cyclic engines using only quantum measurement and feedback, making no use of thermal reservoirs at all.10

Andrew Jordan & Sreenath Manikandan

John Norton replies:

In their letter, Andrew Jordan and Sreenath Manikandan describe Landauer’s principle as “easily derivable from basic principles of thermodynamics and statistical physics.” Yet the argument they offer goes beyond those basic principles. The goal is to establish that an N-state memory device holding random data is thermodynamically like a single-particle gas in a chamber of volume N. The isothermal, reversible compression of the single-particle gas to unit volume passes entropy of k ln N to the heat bath. If the analogy holds, the erasure of the memory device is associated with the same passage of entropy.

The difficulty is that the two systems are not thermodynamically analogous. The uncompressed gas has an entropy that is k ln N greater than that of its compressed state. The memory device holding random data has the same entropy as the erased state. Purely thermodynamic reasoning cannot override this difference. To presume otherwise is to commit the misattribution of entropy described in the main article.

To bridge the gap, Jordan and Manikandan follow the standard literature in employing an argument that goes beyond ordinary thermodynamics. They prescribe an erasure procedure envisaged by Rolf Landauer and others. It begins by thermalizing the memory device, as described in the main article. This irreversible expansion is the real source of the k ln N entropy created in the erasure procedure. The expansion is needed since otherwise, Jordan and Manikandan argue, the erasure must employ different procedures according to the content of the memory device. These different procedures require, they further argue, a computer or its operator to have some memory of the data to be erased, which would mean that the data is merely displaced rather than erased.

This is not a principled, thermodynamic argument. Rather, it is speculation about how all possible erasure devices must operate. It relies on intuitions based on the particular way that modern digital computers operate. In support, Jordan and Manikandan quote Landauer in his original paper remarking on how a computer operates “in most instances.” Mere familiarity is not necessity. The argument fails to demonstrate that all erasure processes must operate in this way. If one is willing to proceed in this loose, speculative manner and neglect fluctuations, one can also imagine protocols that do not require such memory records.11 Whether these protocols are admissible or not does not really matter. The positive but unmet burden of proof rests on proponents of Landauer’s principle. The principle remains now as it was when Landauer first proposed it: a speculation without general proof, based on loose plausibility arguments. Over half a century later, can we not do better?12

This is not the most important of our differences. My view differs from that of Jordan and Manikandan in the import of fundamental principles. They regard thermal fluctuations in molecular-scale processes as eliminable distractions: all it takes to control them arbitrarily well is a suitably ingenious design of the process, and thermodynamically reversible processes can then be implemented in molecular-scale systems. I view thermal fluctuations, that is, thermal noise, as an ineliminable feature of molecular-scale processes, one that derives from a principle, Boltzmann’s S = k ln W. Fluctuations arise inevitably from the molecular constitution of all thermal processes. In principle, their entropically costly suppression is unavoidable if any process is to be brought to completion on molecular scales. Even then, completion can only be assured probabilistically. The relation between the entropy creation required and the probability of completion is quantified in the no-go result. The result is quite general and applies to any process whatever at molecular scales. It is all too easy to underestimate the reach of this last statement.
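Stated schematically (a reconstruction from the works cited in the notes, not a formula given in this letter): Einstein’s inversion of Boltzmann’s formula assigns a state the probability P ∝ exp(S/k), so for the final stage of a process to dominate the initial stage with odds P_fin to P_init, entropy of at least

\[ \Delta S = k \ln \frac{P_{\text{fin}}}{P_{\text{init}}} \]

must be created, independent of temperature. Odds of 20 to 1, for example, already demand ΔS ≈ k ln 20 ≈ 3k, well above the k ln 2 ≈ 0.69k of the Landauer bound for a single bit.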

My concern, expressed in the main article and elsewhere, is that the literature on the Landauer principle has ignored this ineliminable source of entropy creation, rendering many of its analyses spurious. Jordan and Manikandan provide an illustration of this neglect. They describe a molecular-scale erasure process to demonstrate their claim that the Landauer limit is achievable in spite of thermal fluctuations. The process they describe is familiar and a standard in the literature. I have elsewhere given an extended analysis of it for the one-bit case, but without considering fluctuations.13 These considerations can be added now. It will be demonstrated that, as a matter of principle, the entropy creation required to suppress fluctuations precludes realization of the Landauer limit.

In their account, Jordan and Manikandan’s erasure process consists of four steps, each completed before the next starts. The second, third, and fourth of these are executed reversibly. They are

  2. Isothermal N-fold compression of a one-particle gas.
  3. Raising of barriers to divide a chamber into N sections.
  4. Removal of the mechanism producing the compressive force, to restore the original configuration.

Since each step is to proceed reversibly, its components are thermal and arbitrarily close to equilibrium with the other components. The no-go result applies to each step individually. Probabilistic completion of each will require creation of thermodynamic entropy in accord with the no-go result formula.

In the case of steps 3 and 4, it is hard to say more since few details are provided for their implementation. But these details are not needed. All that matters is that, in both steps, the erasure process design calls for moving some system through a sequence of states arbitrarily close to equilibrium states of equal entropy, and toward some desired end state. The no-go result applies.

More details are provided for step 2. The particle is assumed to be electrically charged. It is compressed into 1/Nth of the chamber volume by a suitably manipulated electric field. All fields inside computers are thermal systems and exhibit thermal fluctuations. Albert Einstein demonstrated the necessity of these fluctuations.14 He showed that without them, processes could exist in which the thermal energy of a particle could be completely converted into thermal radiation, in violation of the second law. The thermal fluctuations in the field are needed to enable thermal equilibrium and preserve the second law. Similarly, any device used to manipulate the field is also a thermal system and will exhibit thermal fluctuations. Might some electric circuitry be involved? Such circuits are subject to Johnson–Nyquist noise, which is the electrical engineer’s term for thermal fluctuations in circuitry.
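For a sense of scale, a minimal estimate of the Johnson–Nyquist noise voltage, v_rms = sqrt(4kTRB), with resistance and bandwidth values that are purely illustrative:

```python
import numpy as np

# Johnson-Nyquist noise: rms voltage across a resistor of resistance R
# at temperature T over bandwidth B.  R and B are illustrative values.
k = 1.380649e-23                   # Boltzmann constant, J/K
T, R, B = 300.0, 1e3, 1e9          # 300 K, 1 kOhm, 1 GHz (assumed)
v_rms = np.sqrt(4.0 * k * T * R * B)
print(f"v_rms ~ {v_rms * 1e6:.0f} microvolts")   # ~ 129 uV
```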

To achieve reversibility, the mean of the compressive force provided by the field and supporting circuitry must match almost exactly the mean of the compression-resisting force of the one-particle gas. Superimposed over these means will be the fluctuations that will indifferently advance and retard the process of compression. The size of the fluctuations in the one-particle gas has already been computed in the main text. A thermal particle with three degrees of freedom has a mean energy of 1.5kT with a root mean square spread of 1.22kT. That is, its energy will be fluctuating widely, at least over the interval 0.28kT to 2.72kT. The gains and losses in energy derive from rapid energy exchanges with any system with which it interacts, including the heat bath and the compressing field. To suppress these fluctuations and assure probabilistic completion of the compression, a mean compressive force significantly greater than the mean resistive force must be applied. The ensuing imbalance of forces is dissipative and leads to the creation of entropy. Once again, the no-go result gives the precise relation between the entropy created and the probability of completion.
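The quoted figures follow from the fact that, in units of kT, the kinetic energy of a free particle with three degrees of freedom is distributed as half a chi-squared variable with three degrees of freedom, with mean 3/2 and standard deviation sqrt(3/2) ≈ 1.22. A quick sampling check:

```python
import numpy as np

# Sample the kinetic energy of a free particle, E = m|v|^2 / 2, with
# velocity components drawn from the thermal distribution (kT = m = 1).
rng = np.random.default_rng(1)
v = rng.standard_normal((1_000_000, 3))  # three velocity components
E = 0.5 * (v ** 2).sum(axis=1)           # energy in units of kT
print(E.mean(), E.std())                 # ~ 1.500 and ~ 1.225
print(1.5 - np.sqrt(1.5), 1.5 + np.sqrt(1.5))   # ~ 0.28 and ~ 2.72
```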

To give a more complete analysis of the fluctuations in the electric field and its supporting circuitry would be complicated and possibly prohibitively so. Fluctuations can be observed disrupting the compression in a much simpler case in which gas is compressed in a cylinder by a weighted piston. For a balance of mean forces to be achieved, the weight on the piston must be very small, since the thermal motions of a single particle exert a very weak pressure. Thermal fluctuations in a very light piston lead it to jump about in motions comparable to the thermal motions of the particle in the one-particle gas. Completion of the compression can only be secured probabilistically by employing a weightier piston. The balance of mean forces is lost and the compression is dissipative, creating entropy.15 To preclude confusion, a popular but failed attempt at escaping the effects of fluctuations is to imagine that the one-particle gas is compressed by a very slowly moving, very massive piston. As I explain elsewhere, such a system is far from thermodynamic equilibrium, so the process is not thermodynamically reversible.16

The practical experience of nanotechnology is that all nanoscale devices are beset by thermal fluctuations, electrical noise, or whatever other name it may have. Suppressing the disruptive effects of fluctuations by thermodynamically dissipative processes is a controlling consideration in design. For Jordan and Manikandan, this is merely a technical nuisance. As far as matters of thermodynamic principle are concerned, they say, dissipation is only necessitated when logically irreversible processes are required. They are, I have argued, mistaken. The existence of thermal fluctuations and the high thermodynamic cost of suppressing them is unavoidable. It is grounded in a most important principle of thermal and statistical physics, Boltzmann’s S = k ln W. What is notable is that the formula is independent of temperature. Whatever benefits may be gained by cooling a system, the amount of entropy creation S depends only on the probability of process completion and not on its temperature.


  1. Rolf Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development 5 (1961): 183–91. 
  2. See, e.g., Antoine Bérut et al., “Experimental Verification of Landauer’s Principle Linking Information and Thermodynamics,” Nature 483 (2012): 187–89; Jonne Koski et al., “Experimental Observation of the Role of Mutual Information in the Nonequilibrium Dynamics of a Maxwell Demon,” Physical Review Letters 113 (2014): 030601; John P. Peterson et al., “Experimental Demonstration of Information to Energy Conversion in a Quantum System at the Landauer Limit,” Proceedings of the Royal Society A 472 (2016): 20150813.
  3. A nice review is given in Koji Maruyama, Franco Nori, and Vlatko Vedral, “Colloquium: The Physics of Maxwell’s Demon and Information,” Reviews of Modern Physics 81, no. 1 (2009). 
  4. Rolf Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development 5 (1961): 183–91. 
  5. Charles Bennett, “The Thermodynamics of Computation—A Review,” International Journal of Theoretical Physics 21, no. 12 (1982): 905–40. 
  6. Michael Frank, “The Physical Limits of Computing,” Computing in Science & Engineering 4, no. 3 (2002): 16. 
  7. Kurt Jacobs, “Deriving Landauer’s Erasure Principle from Statistical Mechanics,” (2005), arXiv:quant-ph/0512105v1; Kurt Jacobs, Quantum Measurement Theory and Its Applications (Cambridge, UK: Cambridge University Press, 2014). 
  8. Takahiro Sagawa and Masahito Ueda, “Minimal Energy Cost for Thermodynamic Information Processing: Measurement and Information Erasure,” Physical Review Letters 102 (2009): 250602. 
  9. Jan Klaers, “Landauer’s Erasure Principle in a Squeezed Thermal Memory,” Physical Review Letters 122 (2019): 040602. 
  10. Cyril Elouard and Andrew Jordan, “Efficient Quantum Measurement Engines,” Physical Review Letters 120 (2018): 260601. 
  11. For illustrations, see Section 5.2 of John Norton, “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics 42 (2011): 184–98. 
  12. On the experimental tests of the Landauer principle, I have expressed my disappointment in how little the experiment by Antoine Bérut et al. actually shows. See Antoine Bérut et al., “Experimental Verification of Landauer’s Principle Linking Information and Thermodynamics,” Nature 483 (2012): 187–89, and Section 3.7 of John Norton, “All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation,” Entropy 15 (2013): 4432–83. 
  13. See Section 2 in John Norton, “Eaters of the Lotus: Landauer’s Principle and the Return of Maxwell’s Demon,” Studies in History and Philosophy of Modern Physics 36 (2005): 375–411. 
  14. Albert Einstein, “Zum gegenwärtigen Stand des Strahlungsproblems,” Physikalische Zeitschrift 10 (1909): 185–93. 
  15. I have calculated this case in greater detail in Section 7 of John Norton, “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics 42 (2011): 184–98, and in John Norton, “Thermodynamically Reversible Processes in Statistical Physics,” American Journal of Physics 85 (2017): 135–45. 
  16. See Section 3.7 of John Norton, “All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation,” Entropy 15 (2013): 4432–83. 

Andrew Jordan is professor of physics at the University of Rochester.

Sreenath Manikandan is a PhD student in the Department of Physics and Astronomy at the University of Rochester.

John Norton is Distinguished Professor in the Department of History and Philosophy of Science at the University of Pittsburgh.
