How "hot" works

When you first heard it applied in the context of painful experience as a child, you likely gained appreciation for the meaning of "hot" without understanding the mechanisms behind its reputation.  Our job here is to show you that "hot", as bizarre as this may sound, means "low uncertainty slope for energy exchange".  This is an assertion that draws from the wide applicability of such slopes in gambling theory, which predicts for example that conserved quantities whose entropy is a concave function will likely flow from low to high slope systems when given the chance.  When opportunities for random energy exchange are numerous, such predictions are highly accurate, and may be realized on so rapid a time scale that significant energy transfer occurs before your body has time to react and avoid damage.  When energy is the conserved quantity, the uncertainty slope is called reciprocal temperature or coldness.  It approaches infinity at absolute zero, and at room temperature equals 1/kT ~ 40[nats/eV], where the nat is a unit of information-uncertainty defined so that uncertainty in nats is the natural logarithm of the number of possibilities, S[nats] = Log[Ω].  LASERs operate by taking advantage of inverted-population (i.e. negative uncertainty slope) states to deliver energy most anywhere.  
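
As a quick numeric check of that coldness figure (a sketch only, with room temperature assumed to be 300 K):

    k_B_eV = 8.617333e-5        # Boltzmann's constant [eV/K]
    T = 300.0                   # assumed room temperature [K]

    print(1.0 / (k_B_eV * T))   # ~38.7, i.e. roughly 40 nats per eV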

Technical Aside: Such "law-zero" uncertainty slopes (which for argon gas at STP also include the free expansion coefficient P/kT ~ 2.7×10^19[nats/cc], and the chemical affinity -μ/kT ~ 16[nats/atom]) all behave like a kind of hunger for (or "incentive to acquire through random sharing") the conserved quantity to which each pertains.  Rates of slope change are of interest also.  In natural units, the familiar no-work heat capacity (degrees of freedom over two for quadratic systems) becomes simply (for all b) the uncertainty increase in base-b information units, per b-fold decrease in uncertainty slope.  The corresponding "volume capacity" of an ideal gas is simply the number of particles it contains.  Although microcanonical examples are used here, the uncertainty-slope relations described apply throughout post-Gibbsian statistical physics, and in all other Lagrange-multiplier based applications of entropic priors for statistical inference (of which the uniform, normal, and binomial distributions are special cases).
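
In symbols (a reconstruction from the definitions used below, not a formula shown in the original figures), writing the energy uncertainty slope as β = 1/kT, the heat-capacity statement reads

\[
\frac{C_V}{k} \;=\; \frac{\partial (S/k)}{\partial \ln T} \;=\; -\,\frac{\partial (S/k)}{\partial \ln \beta} ,
\]

so that each b-fold decrease in slope β raises the uncertainty by C_V/k base-b units; for a quadratic system this number is Nν/2, i.e. degrees of freedom over two.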

Some Familiar Molecule Relations

The systems of thermal physics traditionally involve molecules.  Hence we first recall how to convert between molecules N and moles n using the gas constant R = 8.31[J/(mole K)], which is a product of Avogadro's number N_A ~ 6.02×10^23[/mole] and Boltzmann's constant k_B ~ 1.38×10^-23[J/K].

\[ N \;=\; n\,N_A \;, \qquad R \;=\; N_A\,k_B \]
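
A sketch of this bookkeeping (the constants are the standard values; the two-mole sample is an assumed example):

    N_A = 6.02214076e23          # Avogadro's number [molecules/mole]
    k_B = 1.380649e-23           # Boltzmann's constant [J/K]

    n_moles = 2.0                # an assumed amount of gas
    N_molecules = n_moles * N_A  # molecules N from moles n
    print(N_molecules)           # ~1.2e24 molecules
    print(N_A * k_B)             # ~8.31 J/(mole K), the gas constant R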

In what follows, we will use a quantity k to determine the units used for temperature T.  In the particular case when k is chosen to be Boltzmann's constant k_B ~ 1.38×10^-23[J/K], the temperature will be in historical units (e.g. [Kelvins]).  When k=1, or when we equivalently consider the quantity kT rather than T as the temperature, then we will say that temperature is in "natural units".  Below, we show that in natural units temperature may be expressed in Joules (or electron volts) per nat of information uncertainty.
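
To put numbers on that (a sketch, with room temperature again assumed to be 300 K):

    k_B = 1.380649e-23          # Boltzmann's constant [J/K]
    T = 300.0                   # assumed room temperature [K]

    print(k_B * T)              # ~4.1e-21 Joules per nat
    print(k_B * T / 1.602e-19)  # ~0.026 eV per nat, the reciprocal of ~40 nats/eV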

Before examining this in more detail, let's consider a couple of very useful elementary thermodynamic relationships: "equipartition" and "the ideal gas equation of state".

\[ E \;=\; \frac{\nu}{2}\, N k T \]

The above equation relates the extensive quantity E, the amount of randomly-distributed mechanical (kinetic and potential) energy in a gas or solid, to an intensive quantity: its temperature T.  We show later how this relation arises from the equation for a quadratic system's number of accessible states.  Likewise, the equation of state for an ideal gas below follows from the assumption that each molecule in an ideal gas has the full volume V in which to "get lost".

\[ P V \;=\; N k T \]

The above equation thus relates extensive quantity volume V to intensives: gas temperature T and pressure P.

Using these two equations, show that energy and temperature are quite different by proving to yourself that when you build a fire in an igloo, the total thermal kinetic energy of the air inside is unchanged!  (Hint:  This is true even though the temperature of the air goes up.)
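
One way to see it (a minimal sketch; the igloo volume and the diatomic ν = 5 for air are assumed for illustration) is to combine the two relations so that NkT drops out, leaving E = (ν/2)PV with no temperature in sight:

    # E = (nu/2) N k T and P V = N k T together give E = (nu/2) P V.
    # A vented igloo keeps P (the outside air) and V (the ice walls) fixed,
    # so the fire changes T and N, but not the total thermal energy E.
    nu = 5                      # degrees of freedom per molecule (assumed diatomic air)
    P = 101325.0                # pressure [Pa], pinned by the outside atmosphere
    V = 10.0                    # assumed igloo volume [m^3]

    E = 0.5 * nu * P * V        # total thermal kinetic+potential energy [J]
    print(E)                    # ~2.5e6 J, with temperature T nowhere in the formula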

The 0th Law: Gambling with Uncertainty...

To examine the way that thermal physics can give birth to such relations, a useful concept is the multiplicity or "number of accessible states Ω".  Since for macroscopic systems this is often an unimaginably huge number (of order e^(10^24), since its logarithm is comparable to the number of molecules), one commonly deals with its logarithm, the uncertainty or "entropy" S = k Log[Ω].  S is measured in information units {nats, bits, and J/K} depending on whether k is chosen to be {1, 1/Log[2], or 1.38×10^-23[J/K]} respectively.  "Log" here is the natural, not base-10, logarithm.  Knowing the dependence of the multiplicity Ω, and hence S, on any conserved quantity X (like energy, volume, or number of particles) that may be shared randomly between two systems allows one to "guess" how X is likely to end up distributed between the two systems.  One simply chooses that sharing of X which can happen in the largest number of ways, a mathematical exercise which for reasonably well-behaved functions predicts that systems will most likely adopt subsystem X-values for which subsystem uncertainty slopes dS/dX are equal, i.e.  

\[ \frac{dS_1}{dX_1} \;=\; \frac{dS_2}{dX_2} \]
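
Filling in the step behind this prediction: if the two systems share a fixed total X = X1 + X2, the split that can happen in the largest number of ways maximizes the total uncertainty S1 + S2, so

\[
\frac{d}{dX_1}\Big[\,S_1(X_1) + S_2(X_{\mathrm{tot}} - X_1)\,\Big] \;=\; 0
\quad\Longrightarrow\quad
\frac{dS_1}{dX_1} \;=\; \frac{dS_2}{dX_2}\,.
\]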

Example 1: Energy Sharing and Equipartition

This simple assertion yields some powerful results.  Consider first the large class of macroscopic systems whose number of accessible states can be classified as "approximately quadratic in thermal energy", so that Ω ≈ c E^(Nν/2), where N is the number of molecules and ν is the number of degrees of freedom, or ways to store thermal energy randomly, per molecule.  These include all ideal gases, metals near room temperature, and many other macroscopic systems at least in some part of their temperature range.  Using c to denote a constant not dependent on energy E, one can then calculate uncertainty S and its first and second derivatives:  

\[ S \;=\; k\left[\frac{N\nu}{2}\,\ln E \;+\; \ln c\right], \qquad \frac{dS}{dE} \;=\; \frac{k N\nu}{2 E}, \qquad \frac{d^2 S}{dE^2} \;=\; -\,\frac{k N\nu}{2 E^2} \]

The first derivative says that the energy uncertainty slope of such systems, a quantity predicted to "become the same for all subsystems allowed to equilibrate in thermal contact", is dS/dE = kNν/(2E).  This quantity dS/dE has been historically described as reciprocal temperature, i.e. as 1/T.  One can thus solve this equation for energy to get the equipartition relation above: E = (ν/2)NkT.  The second derivative of uncertainty is negative, so that systems with greater energy have lower uncertainty slope.  As a result, energy flow during thermal equilibration goes from systems of lower to higher uncertainty slope, and equivalently from higher to lower temperature.  This rate of uncertainty increase per unit energy gain (also called "coldness") thus behaves like a kind of hunger for (or as we show below "pressure to acquire") thermal energy.  By comparison, hot objects are like reservoirs of excess thermal energy which has limited room to play.  Hence the energy uncertainty slope (about 40 nats/eV at room temperature, running to infinity as one approaches absolute zero) effectively drives the random flow of heat.  The second derivative calculation above (by taking a square root) also allows one to estimate the size of observed temperature (or energy) fluctuations, but we don't consider that further here.
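
A minimal numeric sketch of this prediction (the particle counts and degree-of-freedom values are made-up examples): two quadratic systems sharing a fixed total energy settle where their slopes Nν/2E match, i.e. at a common temperature.

    def slope(N, nu, E):               # dS/dE in natural units (k = 1)
        return N * nu / (2.0 * E)

    N1, nu1 = 1.0e22, 3                # e.g. a monatomic gas: nu = 3
    N2, nu2 = 2.0e22, 5                # e.g. a diatomic gas: nu = 5
    E_tot = 1.0                        # total shared energy, arbitrary units

    # Maximizing S1(E1) + S2(E_tot - E1) equalizes the slopes, which
    # gives E1/E2 = (N1 nu1)/(N2 nu2), i.e. equipartition of energy:
    E1 = E_tot * (N1 * nu1) / (N1 * nu1 + N2 * nu2)
    E2 = E_tot - E1

    print(slope(N1, nu1, E1), slope(N2, nu2, E2))   # equal: a common 1/kT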

Example 2: Volume Sharing and Ideal Gases

A system that has a simple volume dependence for the number of accessible states is the ideal gas.  If the gas has sufficiently low density that gas molecules seldom encounter one another, then the number of places any particular gas molecule may occupy is likely proportional to the volume V to which the gas is confined.  Moreover, the independence of molecules in this low density (ideal gas) case means that the number of accessible states for the gas as a whole is simply proportional to the product of the number of states for each molecule separately, so that Ω ∝ V^N.  As above, we can then calculate uncertainty, and its first and second derivatives:

\[ S \;=\; k\left[\,N\,\ln V \;+\; \ln c\,\right], \qquad \frac{dS}{dV} \;=\; \frac{k N}{V}, \qquad \frac{d^2 S}{dV^2} \;=\; -\,\frac{k N}{V^2} \]

Note that the second derivative is again negative, suggesting that volume will spontaneously flow (when it is being randomly shared) from systems of lower to higher uncertainty slope.  But what is the physical meaning of dS/dV?  A clue might come from thinking of it as a product of dS/dE and dE/dV.  As we discussed above, dS/dE is traditionally written as 1/T, while dE/dV is nothing other than a change in energy per unit volume (a la W = PdV), or a pressure P.  In fact, the free expansion coefficient dS/dV can be written for ideal gases at equilibrium as P/T (i.e. P/kT in natural units).  Equating P/T to the uncertainty slope kN/V yields the ideal gas equation of state PV = NkT, so that any gas allowed to "share volume" adjusts its density (the number of gas molecules per unit volume) to the equilibrium uncertainty slope, or free expansion coefficient.  In natural units this slope at standard temperature and pressure is about 2.7×10^19 nats/cc: much less than the atomic density of around 10^23 atoms per cc for solids.  The negative 2nd derivative predicts that for systems at the same temperature*, volume will "spontaneously flow" from systems of lower to higher pressure.  Put another way, high pressure systems will expand at the expense of their low pressure neighbors, something that is quite consistent with observation.

* A thermally-insulating barrier between two systems which allows "totally random sharing of volume" is difficult to imagine.  Easy to imagine is a rigid but mobile partition, dividing a closed cylinder into two gas-tight halves.  In this case, gases on opposite sides will adjust P to a common value on both sides of the barrier, thus establishing mechanical (momentum transfer) equilibrium with unequal densities and temperatures.  The higher temperature (lower density) side will then experience fewer, albeit higher energy, collisions.  These will eventually result in thermal equilibration by differential energy transfer, even if we have to think of the wall as a single giant molecule with one degree of freedom, whose own average kinetic energy will "communicate" uncertainty slope differences between sides.
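
As a numeric check of the free expansion coefficient quoted above (a sketch using standard constants and conditions):

    k_B = 1.380649e-23      # Boltzmann's constant [J/K]
    P = 101325.0            # standard pressure [Pa]
    T = 273.15              # standard temperature [K]

    slope_per_m3 = P / (k_B * T)     # P/kT in nats per cubic meter
    print(slope_per_m3 * 1e-6)       # ~2.7e19 nats/cc at STP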

Example 3: Particle Sharing and the Law of Mass Action

The random sharing of particles (for example in a reaction) also gains its sense of direction from the 0th law of thermodynamics described here.  First, determine how the number of accessible states depends on the number of particles.  Taking derivatives of uncertainty with respect to particle number for an ideal gas, one finds that dS/dN in natural units (also known as the chemical affinity α = -μ/kT) approaches Log[QV/N], where the "quantum concentration" Q is the number of particles per unit volume allowed by thermal limits on particle momentum.  Here QV/N is effectively the number of available non-interacting quantum states per particle.  As density n = N/V increases toward Q, the ratio Q/n decreases toward 1, the affinity α (near 16[nats/particle] for argon gas at standard temperature and pressure) falls toward zero, and ideality is lost.  Ratios between the Q_i values in gas reactions, for the various components i of a reaction, yield an equilibrium constant that allows one to predict ratios between the resulting concentrations n_i.
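
A sketch checking that 16 nats/particle figure (the argon mass of 39.95 u is assumed, and Q is taken in its usual form (m k T / 2π ħ²)^(3/2) for a monatomic ideal gas):

    import math

    k_B = 1.380649e-23      # Boltzmann's constant [J/K]
    hbar = 1.054571817e-34  # reduced Planck constant [J s]
    u = 1.66053907e-27      # atomic mass unit [kg]

    m = 39.95 * u           # assumed argon atomic mass [kg]
    T = 273.15              # standard temperature [K]
    n = 101325.0 / (k_B * T)                            # density at STP [1/m^3]
    Q = (m * k_B * T / (2 * math.pi * hbar**2)) ** 1.5  # quantum concentration [1/m^3]

    print(math.log(Q / n))  # ~16 nats/atom for argon at STP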

PostScript

The usefulness of these gambling theory tools for dealing with uncertainty, of course, does not end with the study of physical systems like gases and solids.  They apply to the study of communication lines and data compression, they apply of course to games of chance (one of their first applications), and they apply as well to more complex physical systems about which we have only partial information (and sometimes mutual information).  To most people, "hot" is just one of the earlier-encountered concepts whose usefulness these tools account for...

UM-StL Physics and Astronomy 2000 © Phil Fraundorf

