The ontological status of information in physics

Nick Wiggershaus

While the incorporation of information-theoretic concepts into (quantum) physics has been enormously successful in recent years, the ontology of information and its connection to entropy remains somewhat puzzling. Often the two terms are simply used interchangeably; conceptually, though, it is questionable whether information and entropy are equivalent (or whether one entails the other). In this talk, I aim to briefly illuminate the ontological status of information in physics.* I focus on syntactic information measures, i.e. (i) Shannon Information, a concept originally stemming from Communication Theory, and (ii) Algorithmic Information (a.k.a. Kolmogorov Complexity), a concept often applied in Computer Science. Shannon Information and Kolmogorov Complexity are linked through Coding Theory, and both are – to some extent – considered crucial components in debates about entropy. I argue that (in the classical case) Shannon Information and Algorithmic Information are both abstract and highly conventional entities. Either ‘kind of information’ is multiply realizable by objects or physical systems; however, what counts as a suitable object or physical system seems to be entirely contingent on the stipulations of the user, so both measures are to a large degree conventional.
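To make the contrast between the two syntactic measures concrete, here is a minimal illustrative sketch (not part of the talk): Shannon entropy is a property of a probability distribution over messages, while Kolmogorov complexity is a property of an individual string. Since Kolmogorov complexity is uncomputable, the sketch approximates it from above with an off-the-shelf compressor, a standard heuristic; the specific example strings are my own choices.

```python
import hashlib
import math
import zlib

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

    Defined for a probability distribution, not for a single string.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def compressed_length(s: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: length of the
    zlib-compressed string at maximum compression level."""
    return len(zlib.compress(s, 9))

# A fair coin toss carries exactly 1 bit of Shannon information.
h_coin = shannon_entropy([0.5, 0.5])

# A highly regular string compresses to almost nothing, reflecting low
# algorithmic information; a pseudorandom string of the same order of
# length is essentially incompressible.
regular = b"ab" * 500                     # 1000 bytes, trivially patterned
irregular = b"".join(                     # 1024 bytes, no exploitable pattern
    hashlib.sha256(bytes([i])).digest() for i in range(32)
)
```

Note that the two strings could be drawn from the same source distribution and hence carry the same Shannon information per symbol, while differing drastically in algorithmic information; this is one way the two measures come apart.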

*The talk is partly based on the results of my MA thesis, The Ontological Status of Information in Physics, which I submitted in November 2017 for my HPS degree at Utrecht University.