
of ) the positive values of the change in \(\eta\) caused by the second change will be in part superposed on negative values due to the first change, and vice versa.

The disturbance of statistical equilibrium, therefore, produced by a given change in the values of the external coördinates may be very much diminished by dividing the change into two parts separated by a sufficient interval of time, and a sufficient interval of time for this purpose is one in which the phases of the individual systems are entirely unlike the first, so that any individual system is differently affected by the change, although the whole ensemble is affected in nearly the same way. Since there is no limit to the diminution of the disturbance of equilibrium by division of the change in the external coördinates, we may suppose as a general rule that by diminishing the velocity of the changes in the external coördinates, a given change may be made to produce a very small disturbance of statistical equilibrium.
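The closing claim of this paragraph, that slower changes of the external coördinates produce smaller disturbances of statistical equilibrium, can be illustrated numerically. The sketch below is a modern aside, not part of Gibbs' text: it follows an ensemble of identical harmonic oscillators whose spring constant stands in for an external coördinate, and uses the spread of the adiabatic invariant \(J = E/\omega\) across the ensemble as a rough proxy for the disturbance. All names, parameter values, and the choice of proxy are illustrative assumptions.

```python
# Illustrative sketch (not Gibbs' notation): an ensemble of identical
# one-dimensional harmonic oscillators whose spring constant k, playing
# the role of an external coordinate, is ramped from K0 to K1 over a
# duration T.  The spread of the adiabatic invariant J = E/omega across
# the ensemble is taken as a rough stand-in for the disturbance of
# statistical equilibrium; it shrinks as the ramp is made slower.
import numpy as np

rng = np.random.default_rng(0)

M = 1.0             # oscillator mass
K0, K1 = 1.0, 4.0   # initial and final spring constants
E0 = 1.0            # common initial energy of every system
N = 1000            # number of systems in the ensemble
DT = 1e-3           # integration time step


def k_of_t(t, T):
    """Linear ramp of the spring constant over duration T."""
    if t >= T:
        return K1
    return K0 + (K1 - K0) * t / T


def disturbance(T):
    """Standard deviation of the final action J = E/omega over the ensemble."""
    phi = rng.uniform(0.0, 2.0 * np.pi, N)     # random initial phases
    omega0 = np.sqrt(K0 / M)
    amp = np.sqrt(2.0 * E0 / K0)
    x = amp * np.cos(phi)
    v = -amp * omega0 * np.sin(phi)
    t = 0.0
    while t < T:
        # velocity-Verlet step with the instantaneous spring constant
        a = -k_of_t(t, T) * x / M
        x = x + v * DT + 0.5 * a * DT * DT
        a_new = -k_of_t(t + DT, T) * x / M
        v = v + 0.5 * (a + a_new) * DT
        t += DT
    E = 0.5 * M * v ** 2 + 0.5 * K1 * x ** 2
    J = E / np.sqrt(K1 / M)
    return J.std()


for T in (0.1, 1.0, 10.0, 100.0):
    print(f"ramp duration T = {T:6.1f}  ->  spread of J = {disturbance(T):.4f}")
```

For a nearly sudden ramp the spread of \(J\) should be of the order of the initial action itself, while for a slow ramp it should approach zero, in keeping with the limit described in the paragraph above.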

If we write \(\bar{\eta}'\) for the value of the average index of probability before the variation of the external coördinates, and \(\bar{\eta}''\) for the value after this variation, we shall have in any case

\[\bar{\eta}'' \leq \bar{\eta}'\]

as the simple result of the variation of the external coördinates. This may be compared with the thermodynamic theorem that the entropy of a body cannot be diminished by mechanical (as distinguished from thermal) action.[1]
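Under the correspondences named in the footnote (entropy answering to \(-\bar{\eta}\), temperature to \(\Theta\)), the inequality just stated can be read as below; this is an interpretive gloss, not a formula printed on this page.

```latex
% Reading of the inequality under the correspondences named in the footnote
% (entropy ~ -\bar{\eta}, temperature ~ \Theta); an interpretive sketch,
% not a formula from the original text.
\[
  S \;\longleftrightarrow\; -\bar{\eta}, \qquad
  T \;\longleftrightarrow\; \Theta ,
\]
\[
  \bar{\eta}'' \le \bar{\eta}'
  \quad\Longleftrightarrow\quad
  S'' \ge S' ,
\]
% i.e. the statement that the average index of probability cannot be
% increased by varying the external coordinates answers to the
% thermodynamic statement that entropy cannot be diminished by
% mechanical action.
```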

If we have (approximate) statistical equilibrium between the times \(t'\) and \(t''\) (corresponding to \(\bar{\eta}'\) and \(\bar{\eta}''\)), we shall have approximately

\[\bar{\eta}'' = \bar{\eta}'\]
which may be compared with the thermodynamic theorem that the entropy of a body is not (sensibly) affected by mechanical action, during which the body is at each instant (sensibly) in a state of thermodynamic equilibrium. Approximate statistical equilibrium may usually be attained
  1. The correspondences to which the reader's attention is called are between \(-\bar{\eta}\) and entropy, and between \(\Theta\) and temperature.