by baumsm2a

Temperature exists. While this may seem like an incredibly bold assertion, it’s really very easy to prove.

Like all proofs, this requires a bit of setup: in our case, a few definitions and an axiom. Let’s start with the definitions.

A microstate of a physical system is a complete specification of all the parameters of the system: in a classical gas, this would be the positions and momenta of every particle; in a quantum system, it would be the wavefunction or quantum state; etc.

A macrostate of a physical system is a complete specification of everything we can measure about it: pressure, volume, surface tension, etc.

The fundamental postulate of statistical mechanics says that a system is found with equal probability in each of its accessible microstates, where “accessible” means “consistent with the given macrostate.” In classical systems the state space is continuous, so the probability becomes a probability density, but this is easily worked around by dividing phase space up into cells of size h (Planck’s constant).
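To make the postulate concrete, here is a toy example of my own (not part of the proof): four two-level units whose macrostate is their total energy. Enumerating the accessible microstates and giving each one equal probability:

```python
from itertools import product

# Toy system (illustration only): four two-level units, each either in its
# ground state (0) or excited (1, costing one unit of energy).
# The macrostate is the total energy; a microstate is the full tuple of states.
E_total = 2
microstates = [s for s in product((0, 1), repeat=4) if sum(s) == E_total]

# The fundamental postulate: each accessible microstate is equally likely.
print(len(microstates))        # 6 accessible microstates
print(1 / len(microstates))    # probability of each one, ~0.167
```

Only the six arrangements with exactly two excited units are “accessible”; the other ten microstates are inconsistent with the macrostate and get probability zero.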

On to the proof. Imagine a system with two subsystems, which we will imaginatively call 1 and 2, and which collectively share some energy E. Let us go further and say that we cannot measure the energies of 1 and 2 directly, only their total, so that E completely specifies a macrostate. 1 and 2 are allowed to exchange thermal energy, but not particles, and we let them do so until they are in equilibrium and nothing is changing anymore. The system has to be in some microstate, so 1 has to have some energy E_1 and 2 has to have some energy E - E_1. Each subsystem also has a function that specifies how many microstates are available to it at a given energy, which we will call \Omega_1 and \Omega_2. The total number of microstates available to the whole system is thus \int_0^E\Omega_1(E_1)\Omega_2(E-E_1)\, dE_1.

The fundamental postulate of statistical mechanics says that we will find 1+2 in each of these microstates with equal probability. It is thus very probable that we will find the energies of 1 and 2 to be the E_1 and E_2 := E-E_1 that maximize \Omega_1(E_1)\Omega_2(E_2). (In practice, this function has such a sharp peak that it’s basically certain you’ll find those energies, but proving that is considerably trickier.) This is the energy partition where the derivative of this expression with respect to E_1 is 0, i.e.

\frac{d\Omega_1}{dE_1}\Omega_2 - \Omega_1\frac{d\Omega_2}{dE_2} = 0.

Rearranging gives us

\frac{1}{\Omega_1}\frac{d\Omega_1}{dE_1} = \frac{1}{\Omega_2}\frac{d\Omega_2}{dE_2}.

Astute readers will recognize these expressions as derivatives of logarithms:

\frac{d\log\Omega_1}{dE_1} = \frac{d\log\Omega_2}{dE_2}.

We have, then, identified a function of a system, depending only on the state of that system, and having the happy advantage of being measurable, which is the same across any two (and thus any N) systems in thermal equilibrium.  It does not seem too unreasonable to call such a function “temperature.”
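The equilibrium condition is easy to check numerically. As a sketch (the power-law form \Omega(E) \propto E^N is my toy choice, roughly how microstate counts scale in large systems), a brute-force maximization of \Omega_1\Omega_2 lands right where the log-derivatives match:

```python
import numpy as np

# Toy model (illustration only): Omega(E) ~ E**N for each subsystem.
N1, N2, E = 30, 70, 1.0

E1 = np.linspace(1e-6, E - 1e-6, 100001)
log_prod = N1 * np.log(E1) + N2 * np.log(E - E1)  # log of Omega1 * Omega2

E1_star = E1[np.argmax(log_prod)]

# The maximum should sit where d(log Omega1)/dE1 = d(log Omega2)/dE2,
# i.e. N1/E1 = N2/(E - E1)  =>  E1 = E * N1/(N1 + N2) = 0.3 here.
print(E1_star)  # ≈ 0.3
```

The energy splits in proportion to the particle numbers, which is exactly what the matched log-derivatives N_1/E_1 = N_2/E_2 predict for this toy model.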

As a bonus, we’ve also defined entropy along the way, as the logarithm of the number of microstates of a system. (Oh, sure, Boltzmann’s constant is in there too, but it’s just a conversion factor.) Temperature can then be defined as the derivative of entropy with respect to energy.
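For a concrete instance (a standard textbook result, not derived in this post): a monatomic ideal gas of N particles has \Omega(E) \propto E^{3N/2}, so

S = k\log\Omega = \frac{3N}{2}k\log E + \text{const}, \qquad \frac{1}{T} = \frac{dS}{dE} = \frac{3Nk}{2E},

which rearranges to E = \frac{3}{2}NkT, the familiar equipartition result.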

Enterprising readers with a bit of leisure time may want to show for themselves, as a diverting and illuminating exercise, that this definition of entropy makes sense, i.e., that it is extensive and obeys the 2nd law of thermodynamics.
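A numerical nudge toward that exercise (a toy sketch of my own, using two-level sites so that \Omega(N, E) = \binom{N}{E}): entropies add because logarithms turn products into sums, and letting two subsystems exchange energy can only increase the total number of accessible microstates:

```python
from math import comb, log

# Toy check (illustration only): two blocks of 50 two-level sites each,
# with Omega(N, E) = C(N, E) microstates when E sites are excited.
N1 = N2 = 50
E1, E2 = 10, 30

# Extensivity: log(Omega1 * Omega2) = log(Omega1) + log(Omega2).
S_before = log(comb(N1, E1)) + log(comb(N2, E2))

# After thermal contact, every split of E1 + E2 is accessible:
# sum over e of C(N1, e) * C(N2, E - e) = C(N1 + N2, E) (Vandermonde).
S_after = log(comb(N1 + N2, E1 + E2))

print(S_after >= S_before)  # True: removing a constraint never lowers entropy
```

This is the 2nd law in its microcanonical guise: the constrained count \Omega_1(E_1)\Omega_2(E_2) is just one term in the unconstrained sum, so the entropy of the combined system can only go up when the subsystems are allowed to equilibrate.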

Edit: a reader pointed out that this doesn’t define temperature, it defines “thermodynamic beta,” which is \frac{1}{kT}. And they are right. But beta is a more fundamental quantity, and hey, what’s a coordinate transformation between friends?