AA 2025-26, Vojkan Jaksic
The course is organized in 7 units. The theme of Part 1 is the Boltzmann-Gibbs-Shannon (BGS) entropy of a finite probability distribution (p1, … , pn) and its various deformations, such as the Rényi entropy, the relative entropy, and the relative Rényi entropy. The BGS entropy and the relative entropy have intuitive and beautiful axiomatic characterizations, which will be discussed. The Rényi entropies also have axiomatic characterizations, but those are perhaps less natural, and we shall not discuss them in detail. Instead, we shall motivate the Rényi entropies by the so-called Large Deviation Principle (LDP) of probability theory.

The link between the LDP and notions of entropy runs deep and will play a central role in these lecture notes. For this reason, Cramér’s theorem is proven right away in the introductory Part 2 (the more involved proof of Sanov’s theorem is given in Part 5). It is precisely this emphasis on the LDP that makes this course somewhat unusual in comparison with other introductory presentations of information-theoretic entropy.

The Fisher entropy and the related topic of parameter estimation are also an important part of these lecture notes. The historical background and most of the applications of these topics lie in the field of statistics. There is hope that they may play an important role in the study of entropy in non-equilibrium statistical mechanics, and that is the reason for including them in the course.
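For concreteness, here is a minimal numerical sketch of the two central quantities named above, the BGS entropy H(p) = -Σ pᵢ log pᵢ and the Rényi entropy H_α(p) = (1/(1-α)) log Σ pᵢ^α; the function names and the uniform-distribution example are illustrative choices, not taken from the notes themselves:

```python
import math

def bgs_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy H(p) = -sum p_i log p_i,
    with the standard convention 0 log 0 := 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = (1/(1-alpha)) log sum p_i^alpha for alpha != 1.
    The limit alpha -> 1 recovers the BGS entropy."""
    if alpha == 1:
        return bgs_entropy(p)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

# On the uniform distribution over n points, H(p) and H_alpha(p)
# all equal log n, for every alpha.
p = [0.25] * 4
print(bgs_entropy(p))         # log 4 ≈ 1.3863
print(renyi_entropy(p, 2.0))  # also log 4 on the uniform distribution
```

The uniform case is a useful sanity check precisely because all the deformations collapse to the same value there; the Rényi family only separates distributions away from uniformity.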