This book explicates the concept of entropy and its governance of all of thermal physics, across a broad range of equilibrium and nonequilibrium phenomena. Historical development and modern research are presented in the context of entropy as a fundamental element of probability theory and of its relation to the notion of information.
This book is based on the premise that the entropy concept, a fundamental element of probability theory as logic, governs all of thermal physics, both equilibrium and nonequilibrium. The variational algorithm of J. Willard Gibbs, dating from the nineteenth century and extended considerably over the following hundred years, is shown to govern the entire range of thermal phenomena, with only the nature of the macroscopic constraints changing from one application to the next. Beginning with a short history of the development of the entropy concept by Rudolf Clausius and his predecessors, along with the formalization of classical thermodynamics by Gibbs, the first part of the book describes the quest to uncover the meaning of thermodynamic entropy, which leads to its relationship with probability and information as first envisioned by Ludwig Boltzmann. The recognition of entropy, above all, as a fundamental element of probability theory in the mid-twentieth century led to deep insights into both statistical mechanics and thermodynamics, the details of which are presented here in several chapters. The later chapters extend these ideas to nonequilibrium statistical mechanics in an unambiguous manner, thereby exhibiting the overall unifying role of entropy.
Publisher: Oxford University Press
Imprint: Oxford University Press
Number of Pages: 224