====== Entropy, the Second Law and Information Theory ====== **Arieh Ben-Naim**\\ Department of Physical Chemistry\\ The Hebrew University of Jerusalem\\ Jerusalem, Israel\\ <blockquote> **Albert Einstein on Thermodynamics: “It is the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.”** In 1948 Shannon published “A Mathematical Theory of Communication.” In this article Shannon defined a quantity that measures the amount of information, or of missing information, or of uncertainty, associated with any probability distribution. We shall refer to this quantity as the Shannon measure of information (SMI). Unfortunately, Shannon named the SMI “entropy.” This has caused great confusion in applying the concept of entropy. Although the two concepts of entropy and SMI have an identical formal structure when defined in terms of probability distributions, they are very different in their ranges of applicability. In this lecture we start by presenting three definitions of entropy, which are different but equivalent. We then derive the thermodynamic entropy from the SMI. The overall plan of obtaining the entropy of an ideal gas from the SMI consists of four steps:\\ First, we calculate the locational SMI associated with the equilibrium distribution of the locations of all the particles in the system.\\ Second, we calculate the velocity SMI associated with the equilibrium distribution of the velocities (or momenta) of all the particles.\\ Third, we add a correction term due to the quantum-mechanical uncertainty principle.\\ Fourth, we add a correction term due to the fact that the particles are indistinguishable.\\ Once we combine the results of these four steps, we obtain, up to a multiplicative constant, the entropy function of an ideal gas. This is the same entropy function obtained by Sackur and Tetrode in 1912.
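As a minimal sketch of Shannon's definition (the function name ''smi'' and the example distributions are ours, chosen for illustration), the SMI of a distribution p is H(p) = Σᵢ pᵢ log₂(1/pᵢ), the average of the log-unlikelihood of the outcomes:

```python
import math

def smi(p):
    """Shannon measure of information (SMI), in bits:
    H(p) = sum_i p_i * log2(1/p_i), with 0*log(1/0) taken as 0."""
    assert abs(sum(p) - 1.0) < 1e-9, "p must be a probability distribution"
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

# A fair coin leaves one bit of missing information; a biased coin less;
# a uniform distribution over four outcomes leaves two bits.
print(smi([0.5, 0.5]))   # 1.0
print(smi([0.9, 0.1]))   # ~0.47, less uncertainty than the fair coin
print(smi([0.25] * 4))   # 2.0
print(smi([1.0]))        # 0.0, a certain outcome carries no missing information
```

The four steps listed above then add the velocity SMI and the two correction terms; the result they arrive at is the Sackur–Tetrode expression, which in its standard textbook form reads S = N k_B [ln(V/(N λ³)) + 5/2], with λ = h/√(2π m k_B T) the thermal de Broglie wavelength.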
Once we have obtained the entropy from the SMI, we can take this derivation as the definition of entropy. Following this definition, we shall discuss three different interpretations of entropy, which may be referred to as the **“informational interpretations” of entropy**. By an “informational interpretation” of entropy we mean an interpretation based on the SMI, which is different from the interpretation of entropy as a measure of information (see below). These are:\\ 1. Average uncertainty\\ Numerous authors claim that entropy is a measure of uncertainty. Uncertainty with respect to what? The entropy is the average uncertainty about the occurrence of all possible micro-states. In a classical description of a system, a micro-state is a detailed specification of the positions and velocities of all the particles.\\ 2. Average unlikelihood\\ The entropy is the average unlikelihood of the occurrence of all possible micro-states.\\ 3. Interpretation as a measure of information\\ This is the trickiest interpretation. Entropy is not an average information, but a measure of the information associated with, or contained in, the entire probability distribution of the locations and momenta of all the particles at equilibrium.\\ All these correct interpretations are derived from the definition of entropy. In the second part of the lecture we discuss different formulations of the Second Law of Thermodynamics. In the third part we discuss several misinterpretations and misuses of entropy and the Second Law. There are many interpretations (disorder, spreading, freedom, etc.) which are used to describe entropy, but all are incorrect. We shall also discuss a few misapplications of entropy, such as to living systems or to the entire universe. **References**\\ Ben-Naim, A. (2008), A Farewell to Entropy: Statistical Thermodynamics Based on Information. 
World Scientific, Singapore.\\ Ben-Naim, A. (2009), An Informational-Theoretical Formulation of the Second Law of Thermodynamics. J. Chem. Education, 86, 99.\\ Ben-Naim, A. (2010), Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature. World Scientific, Singapore.\\ Ben-Naim, A. (2015), Information, Entropy, Life and the Universe: What We Know and What We Do Not Know. World Scientific, Singapore.\\ Ben-Naim, A. (2017a), Information Theory, Part I: An Introduction to Fundamental Concepts. World Scientific Publishing, Singapore.\\ Ben-Naim, A. (2017b), The Four Laws That Do Not Drive the Universe. World Scientific Publishing, Singapore.\\ Ben-Naim, A. (2017c), Entropy: The Truth, the Whole Truth and Nothing but the Truth. World Scientific Publishing, Singapore.\\ Ben-Naim, A. (2018), Time’s Arrow (?): The Timeless Nature of Entropy and the Second Law of Thermodynamics. Lulu Publishing Services.\\ Ben-Naim, A. (2019), Entropy for Smart Kids and Their Curious Parents. Cambridge Scholars, UK.\\ Ben-Naim, A. and Casadei, D. (2017), Modern Thermodynamics. World Scientific Publishing, Singapore.\\ </blockquote>