information-as-choice

Information is defined as a measure of the freedom of choice when selecting a message from a set of possible messages, regardless of the message's meaning.

1 chapter across 1 book

Great Ideas in Information Theory, Language and Cybernetics (1966), Jagjit Singh

chapter II WHAT IS INFORMATION?

To discover a foothold for a metrical theory designed to serve the purpose described in the last chapter, consider any source of information. It produces messages by successively selecting discrete symbols from a given stock such as letters of an alphabet, words in a dictionary, notes of a musical scale, colors in a spectrum, or even the mere dash-dot twin of telegraphy. In other words, the message actually transmitted is a selection from a set of possible messages formed by sequences of symbols of its own repertoire. The communications system is designed to transmit each possible selection, not merely the one that happened to be actually chosen at the moment of transmission.

To take a simple example, consider an ordinary doorbell. It is designed to communicate one or the other of a set of only two possible messages. Either (when pressed) it announces the arrival of a visitor or (when untouched) it denotes the absence of one. In more elaborate systems like the telephone or telegraph the set of possible messages is the aggregate of sequences of words in, say, the English vocabulary.

No doubt many of these sequences will be meaningless gibberish without any matter. But here one may repeat with less punning and more literal truth what Locke said in a famous philosophical controversy: "No matter; never mind." For the technique of the communication process pays no heed to the matter of the messages in question. The physical process of transmission such as the telephone or radio will transmit infantile twaddle as readily as a meaningful saying from the Talmud. Consequently, the metrical theory of information is not concerned with the semantic content of the set of messages from which it selects some particular one for transmission.

Because of the need, for our present purposes at least, to steer clear of meaning, "information" in this context is merely a measure of one's
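The excerpt's idea of a message as one selection from a set of possible sequences can be made concrete by enumerating that set. A minimal sketch (the symbol repertoire and message length here are illustrative choices, not from the book):

```python
from itertools import product

# A source's repertoire of symbols: the dash-dot "twin" of telegraphy.
symbols = [".", "-"]
n = 3  # message length (an illustrative choice)

# Every possible message of length n is one sequence of symbols;
# the set of possible messages grows multiplicatively: 2**3 = 8.
messages = ["".join(seq) for seq in product(symbols, repeat=n)]
print(len(messages))  # 8

# The doorbell is the degenerate case: a repertoire of just two
# possible messages ("pressed" / "untouched").
```

The multiplicative growth of the message set with length is what motivates the logarithmic measure introduced later in the chapter.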

The chapter explores the foundational concept of information as a measure of choice among a set of possible messages produced by a communication source, independent of semantic content. It introduces the idea that information can be quantified using logarithms to transform multiplicative combinations of message sets into additive measures, leading to the definition of the bit as a unit of information. The chapter also addresses the limitation of assuming equal probability for all messages and extends the measure to account for differing probabilities, showing how information content decreases as predictability increases.