A twilight zone in natural language: between certainty and confusion

Marc Depuis

The concepts of information entropy and surprisal, first introduced by Claude Shannon in 1948, can be usefully applied to various disciplines, including the study of natural language. Languages vary widely in the size and content of their phonological inventories and in the actual phonetic realisations of their elements by speakers. A particularly interesting situation arises when words or discrete sounds of language A, spoken by a native speaker, are perceived and interpreted by native speakers of language B. The presentation will therefore focus on entropy and surprisal in the context of cross-language communication. To illustrate the subject and the impact of entropy and surprisal, a small number of brief perception experiments involving the audience will be carried out.
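As a minimal sketch of the two quantities named above (an illustration, not part of the talk itself): Shannon's surprisal of an outcome with probability p is −log₂ p bits, and entropy is the expected surprisal over a distribution. The toy letter-frequency example below uses an arbitrary sample sentence chosen purely for illustration.

```python
import math
from collections import Counter

def surprisal(p):
    """Surprisal (self-information) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the expected surprisal over a distribution, in bits."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# Toy example: letter probabilities estimated from a short sample sentence.
text = "the quick brown fox jumps over the lazy dog"
counts = Counter(c for c in text if c.isalpha())
total = sum(counts.values())
probs = {ch: n / total for ch, n in counts.items()}

# Rarer letters carry more surprisal than common ones.
print(f"surprisal('e') = {surprisal(probs['e']):.2f} bits")
print(f"surprisal('z') = {surprisal(probs['z']):.2f} bits")
print(f"entropy of the letter distribution = {entropy(probs.values()):.2f} bits")
```

The same machinery underlies the cross-language scenario in the abstract: a sound that is frequent (low surprisal) for speakers of language A may be rare or absent (high surprisal) for listeners from language B.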