Physics / Critical Essay

Vol. 4, No. 4 / July 2019

In his 1961 paper, “Irreversibility and Heat Generation in the Computing Process,” Rolf Landauer speculated that there exists a fundamental link between heat generation in computing devices and the computational logic in use.1 According to Landauer, this heating effect is the result of a connection between the logic of computation and the fundamental laws of thermodynamics. The minimum heat generated by a computation, he argued, is fixed by its logic alone: the limits are the same no matter the hardware, or the way in which the logic is implemented. His analysis became the foundation for both a new literature, termed “the thermodynamics of computation” by Charles Bennett,2 and a new physical law, Landauer’s principle.3

Landauer’s original paper was speculative. The results were made plausible, but not demonstrated. Although employed widely, these claims are, at best, supported by flawed argumentation and, at worst, in contradiction with standard thermal and statistical physics. Despite numerous attempts, these difficulties have never been resolved.

Many of the problems can be traced to a misapplication of Boltzmann’s formula, S = k ln W, and, in particular, a failure to recognize the dynamical character of the probability W.

An Enticing Proposal

The propositions arising from Landauer’s paper can be enumerated as follows:

  1. The minimum heat generated by a computation is determined by its logical character, independently of hardware or procedures used.
  2. Logically irreversible computations, such as erasure, necessitate heat generation—logically reversible computations do not.
  3. Logically irreversible computations must be implemented using thermodynamically irreversible processes—logically reversible computations need not.
  4. The erasure of an n-bit memory device reduces the number of memory states from 2^n to one, corresponding to a 2^n-fold compression of the device’s phase space.
  5. Since each of the 2^n states of an n-bit memory device is equally likely, erasure of the device changes the probability of its state from W = 1/2^n to W = 1.
  6. When it is erased, the decrease in thermodynamic entropy S of a memory device can be computed from Boltzmann’s formula, S = k ln W. Since the unerased device has 2^n equally likely states and the erased device just one, erasure changes the entropy of an n-bit memory device by ΔS = k ln 1 – k ln 2^n = –nk ln 2. The second law of thermodynamics prohibits any decrease in total thermodynamic entropy, meaning that this change must be compensated for by an increase in environmental entropy of at least nk ln 2. The Clausius definition of entropy associates these changes with heat transfers: the change in thermodynamic entropy dS in a system is dS = dQrev/T, where dQrev is the heat passed to the system in a thermodynamically reversible process.
  7. Applying the Clausius definition to the process, the erasure of an n-bit memory device is accompanied by the transfer of at least Q = TΔS = nkT ln 2 of heat to the environment.

This collection of claims has become part of the standard repertoire of modern physics under the rubric of Landauer’s principle.
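For a sense of scale, the quantities in propositions 6 and 7 can be evaluated numerically. The short Python sketch below uses the standard value of Boltzmann’s constant; the choice of a single bit and a temperature of 300 K is purely illustrative and is not drawn from Landauer’s paper.

    import math

    k = 1.380649e-23   # Boltzmann's constant in J/K
    T = 300.0          # an illustrative room temperature, in kelvin
    n = 1              # number of bits erased (illustrative)

    delta_S = n * k * math.log(2)   # claimed entropy decrease of the device, in J/K
    Q = T * delta_S                 # minimum heat passed to the environment, in J

    print(delta_S)   # about 9.57e-24 J/K
    print(Q)         # about 2.87e-21 J per bit at 300 K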

Dynamic and Non-Dynamic Probabilities

Landauer’s proposal was based on Boltzmann’s formula, one of the most robust relations in thermal physics. This relation has a remarkable range of applications and has been the starting point for many important studies. Indeed, as a general rule, upon encountering a probability W in a thermal system, Boltzmann’s formula can be reliably used to assign a value to the thermodynamic entropy of the system. But even an expansive relation such as this has limits to its applicability. Landauer’s proposal breaches those limits: Boltzmann’s formula is misused to ascribe an incorrect thermodynamic entropy to a memory device; the entropy creation required to suppress the fluctuations necessitated by the formula is neglected.

The applicability of Boltzmann’s formula is limited by the type of probability employed. W denotes a dynamic probability with a value determined by the dynamics of the system. A thermal system migrates dynamically over many accessible states. The dynamic probability of a state is the fraction of time the system will spend in that state in the limit of infinite time, which is fixed by the dynamics. Nondynamic probabilities, on the other hand, are fixed independently of the dynamics of the thermal system. These probabilities may be fixed as degrees of belief, or by any means other than the dynamical evolution of the thermal state.
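The distinction can be made concrete with a toy example. In the Python sketch below, an invented two-state system hops stochastically between its states, and the fraction of time it spends in each state converges to a value fixed entirely by the hopping dynamics; that limiting fraction is a dynamic probability in the sense just described. The transition probabilities are chosen arbitrarily for illustration.

    import random

    random.seed(0)
    # invented transition probabilities for a toy two-state system
    p_0_to_1 = 0.2   # chance of hopping 0 -> 1 in one time step
    p_1_to_0 = 0.1   # chance of hopping 1 -> 0 in one time step

    state, time_in_1 = 0, 0
    steps = 1_000_000
    for _ in range(steps):
        if state == 0 and random.random() < p_0_to_1:
            state = 1
        elif state == 1 and random.random() < p_1_to_0:
            state = 0
        time_in_1 += state

    # occupation-time fraction: the dynamic probability of state 1
    print(time_in_1 / steps)                   # simulated value
    print(p_0_to_1 / (p_0_to_1 + p_1_to_0))    # stationary value, 2/3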

This dynamic probability underpins the probabilistic understanding of the second law of thermodynamics. In its migrations, the system is most likely to migrate to, and be found in, the most probable state. Reading directly from Boltzmann’s formula, this most probable state has the greatest thermodynamic entropy. This leads to a familiar probabilistic form of the second law: systems spontaneously move to states of maximum thermodynamic entropy—where they remain, most probably. If a uniform distribution of this dynamic probability is assumed over the system’s phase space, this same result is expressed as the near-certain evolution of the system to states associated with the largest volume of phase space. Thermodynamic entropy is thus associated with the logarithm of volumes of phase space by Boltzmann’s formula. Entropy increases and decreases as the volume of phase space associated with a state increases and decreases.

According to the Clausius definition, the entropy of Boltzmann’s formula is only associated with heat when the system has arrived at its most probable state, thermal equilibrium. The Clausius definition is expressed in terms of thermodynamically reversible processes that are only realized in systems brought arbitrarily close to thermal equilibrium. The system’s probability distribution over its phase space is then Boltzmann, or equivalently, canonical.

Misattribution of Thermodynamic Entropy

Landauer’s analysis was based on the assumption that the process of erasure reduces the number of states in a memory device, representing a compression of the device’s phase space and leading to a reduction in thermodynamic entropy. The error here should be obvious. Prior to erasure, the memory device is in just one of its 2^n states; after erasure, it is in another. The logical specification of the erasure process does not require any compression of the phase space, only a relocation of which part is occupied.

The need to compress phase space is a consequence of the erasure protocol employed by Landauer, and nearly all the authors who followed him. Among them, there is an insistence on a dissipative erasure procedure in which the memory device is thermalized. The energetic barriers that prevent each bit from flipping are dropped. As a result, the bits can flip to and fro in thermal agitation, and the memory device can migrate freely over all its 2^n states. This irreversible process is responsible for the thermodynamic entropy that appears in subsequent calculations.

Erasure is portrayed as moving the system from a state of probability W = 1/2^n to W = 1. This probability ratio is not a ratio of dynamic probabilities: the unerased memory device is not migrating over all possible 2^n states. The probability W = 1/2^n assigned to the unerased memory device does not correspond to relative occupation times. If it did, the unerased device would be useless for storing data. The probability W = 1/2^n is derived from another source. If each of the possible states is deemed equally likely, or it is assumed that each configuration will be encountered equally often during typical computation, these are nondynamic probabilities. They cannot be substituted into Boltzmann’s formula if a thermodynamic entropy is to be recovered. Dynamic probabilities appear only at an intermediate stage, when the memory device is thermalized so that it can migrate freely over all its 2^n states. This thermodynamically irreversible step is responsible for the creation of nk ln 2 of entropy. It does not arise from the logic of erasure, but from a step in the particular erasure procedure employed.

This treatment of the memory state as if it were the thermalized state is pervasive among researchers. In response to earlier analyses and articles, I had been assured that more recent demonstrations of Landauer’s principle avoid this conflation. This is not the case. Examination of these purported improvements shows that the conflation remains the basis of virtually all demonstrations.4 It has only become harder to see, since it is buried ever more deeply in a growing thicket of formalism.5

Unavoidable Fluctuations

Thermal fluctuations are necessitated by the dynamic probabilities in Boltzmann’s formula. Consider an ideal gas, consisting of n molecules, that is momentarily confined to half a vessel; the gas will most probably expand to fill the vessel. Since each molecule moves independently, the ratio of probabilities of the final expanded state to the initial state is W = 2^n. Boltzmann’s formula assigns an entropy change of ΔS = k ln 2^n = nk ln 2 to this twofold expansion in volume, matching the expression from the ordinary thermodynamics of ideal gases.6 The dynamical character of the probabilities permits a reversal of this expansion. The probability of all the molecules being momentarily located in the original half of the vessel is just 1/2^n. This spontaneous recompression of the gas would represent an improbable thermal fluctuation.

For macroscopic systems, these fluctuations are imperceptible. In systems with smaller numbers of components, where n is small, the probabilities for fluctuations are substantial and will reverse processes that would easily reach completion in large-n systems. The twofold expansion of a three-molecule ideal gas is reversed by a fluctuation with probability (1/2)^3 = 1/8. Such a reversal will occur routinely.
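The 1/8 recurrence probability, and its rapid collapse as n grows, can be checked by direct sampling. In the Python sketch below, each of n molecules is placed independently in either half of the vessel; the random snapshots are an illustrative stand-in for the occupation times of the dynamically migrating gas.

    import random

    random.seed(1)

    def fraction_all_in_one_half(n, samples=1_000_000):
        """Estimate how often all n independently placed molecules are
        found together in the original half of the vessel."""
        hits = sum(
            all(random.random() < 0.5 for _ in range(n))
            for _ in range(samples)
        )
        return hits / samples

    for n in (3, 10, 50):
        print(n, fraction_all_in_one_half(n), 0.5 ** n)
    # For n = 3 the recurrence probability is 1/8 and such reversals occur
    # routinely; by n = 50 it has already fallen to about 9e-16.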

This result for small-n systems is quite general and is recovered most easily from Einstein’s fluctuation formula.7 Consider a system that has come to thermal equilibrium with a large heat reservoir at temperature T. The system’s energy E is canonically distributed. If <…> designates expectation values, the variance of the energy is related to the mean energy <E> by

<ε^2> = kT^2 d<E>/dT,

where ε = E – <E> is the deviation of the energy from its mean value. Most of the systems commonly considered in thermal physics have Hamiltonians that are quadratic in their canonical phase space variables. The equipartition theorem applies to such systems: each of the m degrees of freedom makes an additive contribution of kT/2 to the mean energy. The mean energy is

<E> = mkT/2.

From Einstein’s fluctuation formula it follows that the spread in the energy, as measured by the root mean square (RMS) deviation, is

<ε^2>^(1/2) = (m/2)^(1/2) kT.

The spread in energy grows slowly with the square root of the number of degrees of freedom m, whereas the mean energy <E> grows faster, linearly in m. For macroscopic systems with large values of m, fluctuations become negligible. In these systems, the mean energy is on the order of 10^24 in units of kT. Energy fluctuations, on the other hand, are only on the order of 10^12 in the same units.
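The contrast between the two growth rates is easy to check from the formulas above. In the Python sketch below, energies are expressed in units of kT, so k and T drop out; the particular values of m are illustrative.

    import math

    def mean_energy(m):
        return m / 2.0                 # <E> = m kT / 2, in units of kT

    def rms_spread(m):
        return math.sqrt(m / 2.0)      # <ε^2>^(1/2) = (m/2)^(1/2) kT, in units of kT

    for m in (3, 1e24):
        print(m, mean_energy(m), rms_spread(m), rms_spread(m) / mean_energy(m))
    # For m = 1e24, the mean is ~5e23 kT and the spread ~7e11 kT: a relative
    # fluctuation of ~1.4e-12. For m = 3, the spread is comparable to the mean.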

For systems with small numbers of components, energy fluctuations in relation to the system’s mean energy will be substantial. A monatomic ideal gas of a single molecule, with three degrees of freedom, is often used as a simple model of a one-bit memory device. The spread in its energy is given by

<ε^2>^(1/2) = (3/2)^(1/2) kT = 1.22kT.

It follows that the gas energy fluctuates over an RMS range of 0.28kT to 2.72kT. Such fluctuations are substantial and will impede the completion of processes. Suppose that the gas is heated to twice its initial temperature in order to double its mean energy. A fluctuation reverting the energy to its original level is well within this RMS range. The energy increase will be spontaneously undone and redone, repeatedly, by fluctuations.
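The numbers quoted for this model follow directly from the equipartition mean and the RMS spread given above. A minimal check in Python, with all energies measured in units of the initial kT:

    import math

    mean = 3 / 2             # <E> = 3kT/2 for three degrees of freedom
    rms = math.sqrt(3 / 2)   # RMS spread, about 1.22 kT

    print(rms)                       # 1.2247...
    print(mean - rms, mean + rms)    # RMS range: about 0.28 kT to 2.72 kT

    # Heating the gas to twice its initial temperature doubles the mean to
    # 3 kT and enlarges the RMS spread to about 2.45 kT (in units of the
    # initial kT), so a fluctuation back down to the original 1.5 kT lies
    # within one RMS deviation of the new mean.
    print(2 * mean, 2 * rms)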

No-Go Result

The previous section has recounted two instances in which thermal fluctuations disrupt the completion of processes in systems with small numbers of components. This disruption is part of a general no-go result that applies to all such systems.8 Ideally, a process should be two things: minimally dissipative, that is, creating the minimum of thermodynamic entropy; and complete with certainty, or at least with high probability. Satisfaction of one of these conditions precludes satisfaction of the other. Such is the no-go result. If minimum thermodynamic entropy creation is sought, any workable probability of completion must be given up. If, however, a substantial probability of completion is sought, thermodynamic entropy must be created in quantities that are large on molecular scales.

Since the n-bit memory devices used by Landauer are small-n systems, this tension applies to all the processes involved in his proposal. No process at this scale can be brought to completion with high probability unless there are dissipative, entropy-creating processes somewhere in the system. Although necessary, their presence is routinely neglected by researchers, along with the associated entropy creation. In the standard erasure protocol, it is assumed that the state space of a thermalized n-bit memory device can be compressed reversibly without the creation of thermodynamic entropy.

The basic idea of the no-go result is recoverable without computation from Boltzmann’s formula. Consider some process that proceeds from an initial state init to a final state fin, where the process moves forward by virtue of the dynamics of the system. In order to minimize entropy creation, all processes must be kept as close as possible to thermodynamic reversibility. In the limit case, a process is sought whose initial entropy Sinit and final entropy Sfin are equal:

Sinit = Sfin.

It follows from Boltzmann’s formula that the dynamic probabilities of the two states are the same:

Winit = Wfin.

Since these probabilities are dynamic, they describe fluctuations. These fluctuations so confound a constant entropy process that it is as likely to be found in its initial state as in its final state.

The completion of any small-component process, computational or otherwise, can only be assured probabilistically, by raising the entropy of the final state relative to the initial state. The resulting entropy costs are substantial. Securing a modest ratio of success, such as (Wfin/Winit) = 20, requires a process that creates at least 3k of entropy:

ΔS = k ln (Wfin/Winit) = k ln 20 = 3k.
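The same formula shows how the entropy cost scales with the odds of completion. A quick Python check, with entropies in units of k and the sample ratios chosen purely for illustration:

    import math

    for ratio in (2, 20, 1000):
        print(ratio, math.log(ratio))   # entropy created, in units of k
    # A ratio of 20 costs about 3.0 k; even a ratio of 2 costs 0.69 k,
    # already equal to the k ln 2 per bit tracked by Landauer's principle.
    print(math.log(2))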

The quantities of entropy required to suppress fluctuations are large in comparison to those tracked by Landauer’s principle. They arise from the same relation S = k ln W that is essential to Landauer’s proposal, and they cannot be dismissed as an inconvenience to be dealt with in other ways.

These quantities of entropy are independent of the logical specification of whichever computation is implemented. They are determined merely by the probability of successful completion. If, as is commonly the case, a computation requires multiple steps to be completed successfully, there will be a corresponding quantity of entropy associated with the completion of each step. An attainable lower limit to dissipation in molecular scale processes is neither given by Landauer’s expression, nor is it independent of the implementation of the computation. Any estimate must include the thermodynamic entropy created to assure completion with the specified probability of each step of the implementation used.

A More Developed No-Go Result

The no-go result described in the previous section did not take account of the intermediate stages between the initial and final states of the process. Their inclusion reveals additional thermodynamic entropy creation whenever a thermodynamically reversible process is attempted on molecular scales.

Consider a process where the degree of completion is tracked by a continuous variable λ. The process could be an expansion or contraction of the accessible volume of a system’s configuration space, a measurement process in which the state of a measuring device is made to match that of a target system, a transfer of data from one memory device to another, the setting of the content of one memory device as a specified function of another device, or any other process for a computational system that has a definite initial and final state. Ascertaining the minimum dissipation involves finding a thermodynamically reversible process in which the thermodynamic entropy S of the system and its environment is close to constant:

dS(λ)/dλ = 0 and thus S(λinit) = S(λ1) = S(λ2) = S(λfin).

Applying Boltzmann’s formula yields a uniform probability density w over λ:

w(λinit) = w(λ1) = w(λ2) = w(λfin).

Since these probabilities are dynamic, it follows that the system is equally likely to be found in the initial and final states, or any arbitrarily chosen intermediate stages. The system fluctuates with limiting occupation times matching these probabilities. Attempting to implement a thermodynamically reversible process leads only to a process so thoroughly confounded by fluctuations that it could equally be in any of its stages.

Suppose that the stages are divided into n steps—λ = 0 to 1, λ = 1 to 2, …, λ = n–1 to n—and that the system is initialized in a state corresponding to a range of values, λ = 0 to 1. Suppose also that the system is then allowed to evolve dynamically over the full range of stages. The final state eq is not the intended final state λ = n – 1 to n. Rather, because all intermediate stages are accessible, it is a state uniformly distributed over all the stages λ = 0 to n. The ratio of probabilities is Weq/Winit = n/1. Thermodynamic entropy creation is given by

ΔS = k ln (Weq/Winit) = k ln n.

This entropy is created without any assurance that stages of larger λ have greater probability. The probability of the intended final state, λ = n – 1 to n, is simply

Wfin = 1/n.
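This uniform outcome is easy to mimic with a toy simulation. In the Python sketch below, an unbiased random walk with reflecting ends stands in for a constant-entropy process over n stages; the walk is an invented illustration rather than a model taken from the text, but its long-run occupation times approximate the uniform dynamic probabilities, giving a final-stage probability of roughly 1/n.

    import random

    random.seed(2)
    n = 10                      # number of stages (illustrative)
    stage, visits = 0, [0] * n
    steps = 1_000_000
    for _ in range(steps):
        move = random.choice((-1, 1))
        if 0 <= stage + move < n:
            stage += move       # unbiased step; stay put at the ends otherwise
        visits[stage] += 1

    occupation = [v / steps for v in visits]
    print(occupation)           # roughly 1/n = 0.1 for every stage
    print(occupation[-1])       # Wfin is about 1/n, as stated above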

To improve the probability of successful completion, the process must be designed so that its later stages have higher thermodynamic entropy:

dS(λ)/dλ > 0.

The greater the entropy change over the stages, the higher the probability of the later stages. A simple example will illustrate how such an entropy gradient enhances the probability of completion.

An Illustration

Consider a system in thermal contact with a large heat reservoir at temperature T and a system Hamiltonian given by

H(λ) = f(π) – ελ.

The stages of the process are tracked by the parameter λ, which is assumed to be canonical. The quantity ε ≥ 0 introduces an energy gradient such that the system evolves toward larger values of λ. The remaining canonical coordinates of the system are represented by π and their contribution to the Hamiltonian by the term f(π), which is independent of the stage of completion. They need not be represented more completely, since they will drop out of the calculation.9

The probability that the system is between two stages λ1 and λ2 is proportional to the partition integral:

Z(λ1, λ2) = ∫∫ exp(–H(λ)/kT) dλ dπ = const. × ∫ exp(ελ/kT) dλ

= const. × (kT/ε) [exp(ελ2/kT) – exp(ελ1/kT)].

Here the λ integrals run from λ1 to λ2 and the π integral runs over the remaining coordinates.

This probability is no longer uniform and, for larger ε, favors larger λ. If the intended final state is λ = n – 1 to n, its probability is

Wfin = Z(n–1, n)/Z(0, n) = [exp(εn/kT) – exp(ε(n–1)/kT)] / [exp(εn/kT) – 1]

= [1 – exp(–ε/kT)] / [1 – exp(–εn/kT)].

The thermodynamic entropy created by the process that released the system from its initial state λ = 0 to 1 is

ΔS = k ln [Z(0, n)/Z(0, 1)] = k ln ([exp(εn/kT) – 1] / [exp(ε/kT) – 1]).

These latter two formulas have two revealing limiting cases. In the limit as ε goes to zero, there is no driving force and the energy gradient vanishes:

Wfin = 1/n and ΔS = k ln n.

This leads to the entropy creation of the more developed no-go result, but with an unsatisfactory probability of success. For large ε, where the process is driven by a steep energy gradient,

Wfin ≈ 1 – exp(–ε/kT) and ΔS ≈ ε(n – 1)/T.

The probability of successful completion can be brought as close to one as needed by making ε sufficiently large. There are, however, large quantities of entropy created in proportion to (n – 1). This entropy can be interpreted in terms of the Clausius definition. It is the entropy created by a reversible transfer of heat to the heat reservoir amounting to ε(n – 1). This heat corresponds to the energy lost by the system when moving from a stage λ = 1 to λ = n.
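Both limits can be checked numerically from the formulas for Wfin and ΔS just derived. In the Python sketch below, energies are measured in units of kT and entropies in units of k; the values n = 10 and ε = 5 kT are illustrative.

    import math

    def W_fin(eps, n):
        """Probability of ending in the final stage, lambda = n-1 to n,
        with the energy gradient eps expressed in units of kT."""
        return (1 - math.exp(-eps)) / (1 - math.exp(-eps * n))

    def delta_S(eps, n):
        """Entropy created on release from lambda = 0 to 1, in units of k."""
        return math.log((math.exp(eps * n) - 1) / (math.exp(eps) - 1))

    n = 10
    for eps in (1e-6, 5.0):
        print(eps, W_fin(eps, n), delta_S(eps, n))
    # eps -> 0 recovers W_fin ~ 1/n = 0.1 and Delta S ~ k ln n ~ 2.3 k.
    # eps = 5 gives W_fin ~ 1 - exp(-5) ~ 0.993 at a cost of about
    # eps*(n - 1) = 45 k.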

Lower Dissipation

The accessibility of intermediate stages in a process creates more thermodynamic entropy than called for by the Boltzmann formula ΔS = k ln (Wfin/Winit). If the attempt to keep all processes thermodynamically reversible is abandoned, it turns out that fluctuations can be used to arrive at this lesser amount of entropy creation.

Lower dissipation is achieved by ensuring that all intermediate states inter have energies far higher than those of the initial and final states, init and fin:

Einter >> Einit     Einter >> Efin.

If the system is in thermal equilibrium with a heat reservoir at temperature T, the probability of a state is canonically distributed. The probability of the intermediate states is then proportional to exp(–Einter/kT). This will be significantly less than the corresponding factors for the probabilities of either the initial or final states, exp(–Einit/kT) and exp(–Efin/kT).

The process only advances when an improbable fluctuation leads the system to jump from the initial state to an intermediate state of much higher energy. The system can then revert to either the initial or the final state. If the factor exp(–Einter/kT) is negligible, the probability that, over the longer term, the process delivers the system to the final state is

Wfin = exp(–Efin/kT) / [exp(–Efin/kT) + exp(–Einit/kT)]

= 1 / [1 + exp((Efin – Einit)/kT)].

Thermodynamic entropy is created in the transition from the initial state to a state that is the probabilistically weighted combination of the initial state and the final state:

ΔS = k ln ([exp(–Einit/kT) + exp(–Efin/kT)] / exp(–Einit/kT))

= k ln [1 + exp(–(Efin – Einit)/kT)].

The force driving the process is the energy difference Efin – Einit. As was the case with the example discussed in the previous section, there are two limiting cases. When the driving force goes to zero, Efin – Einit = 0:

Wfin = 1/2 and ΔS = k ln 2.

This represents an improvement on the results for the previous example. The probability of successful completion has increased from 1/n to 1/2 and entropy creation has been reduced from k ln n to k ln 2. When the driving force is large, Efin – Einit << 0:

Wfin ≈ 1 – exp((Efin – Einit)/kT) and ΔS ≈ –(Efin – Einit)/T.

Once again, the probability of successful completion, Wfin, can be brought as close to one as needed by making Efin – Einit sufficiently negative, although the entropy created rises correspondingly. From the Clausius definition, this is the thermodynamic entropy that would be gained by the thermal reservoir if the energy lost by the system, –(Efin – Einit), were imparted to the reservoir as heat in a thermodynamically reversible process.

A comparison with the formulas of the previous section for a large driving force shows a reduced level of entropy creation for the same probability of completion. For any desired value of Wfin, the process of the previous section, with accessible intermediate states, requires the creation of n – 1 times as much entropy as the process described here, with inaccessible intermediate states.
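The same numerical check applies to the high-barrier process, and it confirms the factor of n – 1. In the Python sketch below, energies are in units of kT and entropies in units of k; the comparison values are taken from the illustrative numbers used above (ε = 5 kT, n = 10).

    import math

    def W_fin(dE):
        """Probability of completion; dE = Efin - Einit in units of kT."""
        return 1 / (1 + math.exp(dE))

    def delta_S(dE):
        """Entropy created, in units of k."""
        return math.log(1 + math.exp(-dE))

    print(W_fin(0.0), delta_S(0.0))    # 0.5 and k ln 2 = 0.693 k
    print(W_fin(-5.0), delta_S(-5.0))  # ~0.993 and ~5.0 k

    # The driven example of the previous section achieved a comparable W_fin
    # (eps = 5, n = 10) at a cost of about 45 k, roughly (n - 1) = 9 times
    # the ~5 k required here.
    print(45.0 / delta_S(-5.0))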

Rendering the intermediate states energetically inaccessible may be appealing due to the ensuing reduction in thermodynamic entropy creation, but it leads to another problem. The probabilities computed here are dynamic and correspond to the relative occupation times of the system. Since the process must still pass through the intermediate states and they have very low probability, the process will take a long time to complete as the system waits for a highly improbable random fluctuation.

Thermodynamic Reversibility

The issues outlined in the preceding sections are the principal difficulties arising from Landauer’s proposal. Seeing them clearly has nonetheless proven difficult because they are obscured by further layers of confusion. An important example concerns thermodynamically reversible processes. Formulating a precise characterization for these processes is challenging.10 Generally speaking, a thermodynamically reversible process proceeds with minute deviations from equilibrium. Within those minute deviations, it can proceed in either direction.

The most troublesome source of confusion in the literature concerning the Landauer principle is the erroneous claim that an irreversible thermalization process is, in fact, thermodynamically reversible.11 Prior to thermalization, it is often argued, there is a probability of 1/2^n that the memory device is in each of the possible 2^n states. After thermalization, the same probabilities are obtained. From the Boltzmann formula, it follows, or so the argument goes, that the entropy of the memory device is unchanged during thermalization and the process is thermodynamically reversible. Of course, the probability prior to thermalization is not a dynamic probability, and Boltzmann’s formula cannot be applied.

This confusion threatens to undo the claim that a logically irreversible process, such as erasure, must be implemented by a thermodynamically irreversible process. If these erroneous claims are accepted, the erasure process can proceed entirely by thermodynamically reversible steps. To erase a memory device, one would first thermalize it, a step that is itself logically irreversible. Its state space would then be compressed to a single state. This can be carried out in a manner analogous to the thermodynamically reversible compression of a gas. The compression requires that thermodynamic entropy be passed to the environment and that, following Clausius’s definition, the environment be heated. The heating effect that grounded Landauer’s original paper is recovered, but now using a process that is supposedly thermodynamically reversible.

Closure

Each of the propositions arising from Landauer’s original paper turns out to be either unfounded or refutable.

  • Contrary to the first and second propositions, the need to suppress fluctuations imposes a lower limit on entropy creation that is unconnected to the logic of the computation. The limit is set by the number of steps involved in the computation and the probability of completion for each.
  • The third proposition is contradicted within the literature on Landauer’s principle itself by the assertion that the logically irreversible thermalization of a memory device is thermodynamically reversible.
  • The fourth proposition fails since erasure only relocates the occupied region of the device’s phase space; it does not compress it.
  • The fifth proposition mixes nondynamic and dynamic probabilities. Inserting them into Boltzmann’s formula as part of the sixth proposition does not yield a thermodynamic entropy.
  • The seventh proposition fails with respect to recovered quantities of heat because the entropy changes computed in the sixth proposition are not thermodynamic entropies subject to the Clausius definition.

  1. Rolf Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development 5 (1961): 183–91. 
  2. Charles Bennett, “The Thermodynamics of Computation—A Review,” International Journal of Theoretical Physics 21, no. 12 (1982): 905–40. 
  3. For helpful surveys of this literature and for its connection with the literature on Maxwell’s demon, see Harvey Leff and Andrew Rex, Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing (Philadelphia: Institute of Physics Publishing, 2003), and Owen Maroney, “Information Processing and Thermodynamic Entropy,” The Stanford Encyclopedia of Philosophy (September 2009), ed. Edward Zalta. 
  4. See the appendix in John Norton, “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics 42 (2011): 196–98. 
  5. A welcome exception is the demonstration provided by James Ladyman et al., “The Connection between Logical and Thermodynamic Irreversibility,” Studies in History and Philosophy of Modern Physics 38 (2007): 58–79; James Ladyman, Stuart Presnell, and Anthony Short, “The Use of the Information-Theoretic Entropy in Thermodynamics,” Studies in History and Philosophy of Modern Physics 39 (2008): 315–24; James Ladyman and Katie Robertson, “Landauer Defended: Reply to Norton,” Studies in History and Philosophy of Modern Physics 44 (2013): 263–71. The problems of these demonstrations have been laid out in John Norton, “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics 42 (2011): 184–98, and John Norton, “Author’s Reply to Landauer Defended,” Studies in History and Philosophy of Modern Physics 44 (2013): 272.
  6. This example is from Albert Einstein’s light quantum paper in which he memorably labels the formula S = k ln W “Boltzmann’s principle.” Albert Einstein, “Über einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt,” Annalen der Physik 17 (1905): 132–48.
  7. Albert Einstein, “Zur allgemeinen molekularen Theorie der Wärme,” Annalen der Physik 14 (1904): 360. 
  8. This no-go result has been developed in John Norton, “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics 42 (2011): §7; John Norton, “All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation,” Entropy 15 (2013): Part II; John Norton, “The End of the Thermodynamics of Computation: A No-Go Result,” Philosophy of Science 80 (2013): 1,182–92; and John Norton, “Thermodynamically Reversible Processes in Statistical Physics,” American Journal of Physics 85 (2017): 135–45. These papers contain computations of fluctuations in specific processes, including the expansion of ideal gases of few and many molecules, and the measuring of the state of an electric dipole.
  9. For two instantiations of systems with this Hamiltonian, see John Norton, “All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation,” Entropy 15 (2013): §10, §11. The first is a small bead that slides frictionlessly on a straight wire. The inclination of the wire to the horizontal yields a gravitationally induced energy gradient that drives the bead from one end of the wire to the other. The second is a charge that we seek to move in a channel. The energy gradient that moves the charge is provided by a constant electric field. 
  10. John Norton, “The Impossible Process: Thermodynamic Reversibility,” Studies in History and Philosophy of Modern Physics 55 (2016): 43–61. 
  11. This is reported as a standard result in Harvey Leff and Andrew Rex, Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing (Philadelphia: Institute of Physics Publishing, 2003), 21. 

John Norton is Distinguished Professor in the Department of History and Philosophy of Science at the University of Pittsburgh.

