# Statistical Interpretation of Entropy

As we have seen, thermodynamics is based for the most part on the idea of the conservation of energy (First Law) and the concept of entropy (Second Law). The conservation of energy poses little problem intuitively, but it is quite another story with entropy. Entropy can be considered from the point of view of idealized heat engines operating in cycles, or by deriving some of its inherent properties (Chapter 5). We will see how it is measured and tabulated in Chapter 7. This is all very useful, but it does not help much in gaining an intuitive grasp of entropy, such as we have for the other thermodynamic parameters.

Just what is entropy, anyway? There may not be any definitive short answer to this question. If we had to rely on classical thermodynamics for an answer, we would talk at some length about the availability of energy, e.g., the fact that in spite of the tremendous quantity of energy in the ocean, we cannot use any of it to power a ship or to do anything else; the ocean's thermal energy is unavailable unless we provide a reservoir for heat at a lower temperature. This is of course perfectly true, and many useful discussions of the meaning of entropy follow this line of thought, but even after all these discussions entropy remains somewhat elusive.

There is, however, another way to think of entropy that is by far the most useful, and that is from the statistical/probability point of view. This requires that we consider matter from the point of view of the individual particles (atoms, molecules, ions) rather than as macroscopic, homogeneous bodies, and it is therefore not a part of classical thermodynamics but of statistical mechanics. In this chapter we present the rudiments of this approach, not so that the reader can become proficient in statistical thermodynamics (a considerably more thorough introduction is required for that) but to show how entropy is related to statistical considerations.
Statistical mechanics does not exactly explain what entropy is, but rather provides a model, quite different from the thermodynamic model, that contains a parameter identical to the entropy of the thermodynamic model in every measurable respect.
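The link between the two models is Boltzmann's relation, S = k ln W, where W is the number of microstates (the "thermodynamic probability" of the keywords below). As a minimal sketch of this connection, the following Python snippet counts W exactly for a toy lattice mixture of two kinds of atoms and compares the resulting S = k ln W with the familiar ideal entropy-of-mixing formula obtained in the large-N (Stirling) limit. The function names and the toy system are illustrative choices, not taken from the chapter itself.

```python
import math

# Boltzmann constant, J/K
K_B = 1.380649e-23

def mixing_entropy_exact(n_a: int, n_b: int) -> float:
    """Configurational entropy S = k ln W, counting microstates exactly.

    For n_a atoms of A and n_b atoms of B distributed over n_a + n_b
    lattice sites, W = (n_a + n_b)! / (n_a! n_b!). We use lgamma to
    evaluate ln W without overflow: ln n! = lgamma(n + 1).
    """
    ln_w = (math.lgamma(n_a + n_b + 1)
            - math.lgamma(n_a + 1)
            - math.lgamma(n_b + 1))
    return K_B * ln_w

def mixing_entropy_stirling(n_a: int, n_b: int) -> float:
    """Large-N limit: S = -k N (x_A ln x_A + x_B ln x_B)."""
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -K_B * n * (x_a * math.log(x_a) + x_b * math.log(x_b))

# The exact count and the Stirling approximation converge as N grows,
# which is why the statistical parameter behaves like the macroscopic
# entropy for bodies containing very many particles.
for n in (10, 1000, 100_000):
    print(n, mixing_entropy_exact(n, n), mixing_entropy_stirling(n, n))
```

For a 50:50 mixture the agreement improves rapidly with particle number; at laboratory scales (N of order 10^23) the two expressions are indistinguishable, which is one way of seeing why a statistical quantity can serve as the entropy of the thermodynamic model.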

*Keywords:*
Boltzmann distribution, configurational entropy of mixing, partition function, thermodynamic probability
