Philosophy of Physics Graduate Lunch Seminar (Monday - Week 4, HT19)


Please note the new title and abstract for this seminar.

What does entropy stand for? Entropy is a central quantity in statistical thermodynamics: in the microcanonical ensemble, thermodynamic potentials and quantities such as the temperature are derived from it. However, while the concept of entropy was rigorously introduced at the phenomenological level by Clausius in his mechanical theory of heat, there is to date no consensus on its interpretation.
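To fix notation (following the standard textbook presentation rather than anything specific to this talk): in the microcanonical ensemble the entropy, and the temperature derived from it, are

    S(E) = k_B \ln \Omega(E), \qquad \frac{1}{T} = \frac{\partial S}{\partial E},

where \Omega(E) is the number of microstates compatible with the energy E.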
Most notably, there are two opposing traditional frameworks for understanding it: Boltzmann’s dynamical account, based on a dominant macrostate, and Gibbs’ probabilistic account, based on fluctuations in an ensemble of microstates.
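In the usual notation (a schematic gloss, not the speaker’s own formulation): Boltzmann’s account assigns

    S_B = k_B \ln |\Gamma_M|,

the logarithm of the phase-space volume of the macrostate M the system actually occupies, whereas Gibbs’ account assigns

    S_G = -k_B \int \rho \ln \rho \, d\Gamma,

a functional of the probability distribution \rho over the whole ensemble of microstates.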
A third approach is based on an information-theoretic account of entropy. Within that approach there are, in fact, two distinct perspectives: Jaynes’ school, which interprets statistical mechanics subjectively as a form of inference, and Landauer’s school, which interprets statistical-mechanical processes in terms of computational, information-processing procedures.
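Schematically, and again in their standard formulations: Jaynes’ school selects the distribution that maximizes -\sum_i p_i \ln p_i subject to whatever constraints are known (for instance a fixed mean energy \sum_i p_i E_i = \langle E \rangle), while Landauer’s school connects entropy to information processing via the bound that erasing one bit of information dissipates at least k_B T \ln 2 of heat.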
In this talk we will explore both of these information-theoretic perspectives, and in particular the question of whether the Gibbs entropy can be obtained as a special case of the Shannon information measure.
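One way to make that question precise (in standard notation, and anticipating rather than settling the discussion): the Shannon information of a distribution p is

    H(p) = -\sum_i p_i \log_2 p_i,

and identifying the p_i with the ensemble probabilities and rescaling by k_B \ln 2 yields the discrete Gibbs entropy

    S_G = k_B \ln 2 \; H(p) = -k_B \sum_i p_i \ln p_i.

Whether this formal identification carries genuine interpretive weight is precisely what is at issue.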