Entropy and the problem of justification

Peter Lukan
University of Ljubljana, Slovenia

When Gibbs introduced his ensemble distributions in statistical physics, his justification was pragmatic. Years later, Jaynes derived the same ensemble distributions using the so-called principle of maximum entropy. This seems a remarkable achievement, but the question of foundation remains. Since Jaynes relies on the concept of information entropy, one way of looking for an answer is to explore the relationship between the classical concept of entropy and its information-theoretic counterpart. This approach in turn raises the question of how exactly to understand the concept of information. I will assess this problem starting from the point of view of information theory, considering different understandings of the concept of information that rely on different interpretations of probability. I will then assess information theory in the context of well-known problems from probability theory. From there I will return to the classical concept of entropy and articulate the main set of problems. Rather than giving categorical answers, I will stress critical weaknesses of both the classical and the information-theoretic sides of the story.
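To make the principle of maximum entropy concrete, the following is a minimal numerical sketch (not taken from the paper) of how Jaynes's procedure recovers the Gibbs canonical distribution: among all distributions with a prescribed mean energy, the one maximizing Shannon entropy has the Boltzmann form p_i ∝ exp(−βE_i). The three energy levels and the target mean energy are arbitrary illustrative values.

```python
import math

# Hypothetical three-level system (illustrative values, not from the paper).
energies = [0.0, 1.0, 2.0]
target_mean = 0.8  # prescribed mean energy <E>

def boltzmann(beta):
    """Canonical (Gibbs) distribution p_i proportional to exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def mean_energy(p):
    return sum(pi * e for pi, e in zip(p, energies))

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Find beta by bisection so the Boltzmann distribution satisfies the
# constraint <E> = target_mean (mean energy decreases monotonically in beta).
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_energy(boltzmann(mid)) > target_mean:
        lo = mid
    else:
        hi = mid
beta = (lo + hi) / 2
p_maxent = boltzmann(beta)

# Any other distribution satisfying the same two constraints (normalization
# and mean energy) has strictly lower entropy. The perturbation direction
# [+eps, -2*eps, +eps] preserves both constraints for these energy levels.
eps = 0.01
p_alt = [p_maxent[0] + eps, p_maxent[1] - 2 * eps, p_maxent[2] + eps]
assert abs(mean_energy(p_alt) - target_mean) < 1e-9
assert shannon_entropy(p_alt) < shannon_entropy(p_maxent)
```

The sketch uses direct numerical constraint-solving rather than Lagrange multipliers, but the outcome is the same: the exponential (Gibbs) form is singled out purely by the entropy-maximization criterion, which is the step whose foundational status the paper questions.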