reliability-through-redundancy

The principle that adding redundant information to a message can improve transmission reliability in noisy channels, albeit at the cost of reducing the effective transmission rate.

2 chapters across 1 book

Great Ideas in Information Theory, Language and Cybernetics (1966), Jagjit Singh

Chapter V: RELIABLE TRANSMISSION AND REDUNDANCY

SINCE reliability-through-redundancy radically reduces the rate of transmission in a noisy channel, an attempt to make a matching optimal code that combines high reliability with high information flow would seem like eating one's cake and having it too. Nevertheless, buttressed by the assurance of Shannon's theorem, which guarantees a way of doing both even if it does not actually lead to it, attempts have been made to reach the goal in stages. As a first step the enormous complexity of the problem is mitigated by a relaxation of its aim. The problem is no longer a search for optimal codes but for those near-optimal ones which secure fairly high reliability by low enough redundancy while still yielding a reasonable rate of transmission.

For the sake of simplicity, assume that our linguistic medium is basic English, with a total vocabulary of 512 words. Suppose further that the probability of occurrence of all the words in the language is the same. Both assumptions are, of course, not true. The actual basic vocabulary exceeds our chosen number (512) by two to three hundred words, and the words' probabilities of occurrence are far from alike. No matter; we have made these assumptions so as not to encumber our exposition of basic principles with inessential detail.

Starting with our vocabulary of 512 words we may enumerate them successively as 1, 2, 3, . . . , 512, as would be the case if we itemized each word in our basic dictionary. Suppose, again for the sake of simplicity, that our channel can transmit only two symbols: a current pulse denoted by 0 and a current gap or circuit closure denoted by 1, both of the same duration. How many symbols must the code assign to each word to be able to encompass all the words in our dictionary? For example, if we assigned two symbols per word, we could have only 2 × 2 = 2², or four different permutations . . .
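The counting argument in the excerpt can be checked directly: n binary symbols yield 2ⁿ distinct code words, so a 512-word vocabulary needs the smallest n with 2ⁿ ≥ 512, namely n = 9. A minimal sketch (the helper name `symbols_per_word` is ours, not the book's):

```python
def symbols_per_word(vocab_size):
    """Fewest binary symbols giving each of vocab_size words a distinct code."""
    # Words numbered 0 .. vocab_size - 1 need as many bits as the largest number.
    return (vocab_size - 1).bit_length()

assert 2 ** 2 == 4                  # two symbols per word cover only four words
assert symbols_per_word(512) == 9   # 2**9 = 512 exactly fits this vocabulary
```

This is the logarithmic relation the chapter summary below refers to: the number of symbols per word grows as log₂ of the vocabulary size.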

This chapter explores the trade-off between reliability and transmission rate in noisy communication channels, emphasizing the role of redundancy in achieving reliable transmission. It introduces the concept of encoding a fixed vocabulary using binary symbols and explains how the number of symbols per word relates to the size of the vocabulary via logarithms. The chapter further discusses the use of parity checks as a simple method to introduce redundancy for error detection and the limitations of single parity checks, leading to the need for more sophisticated error detection and correction methods.
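The single parity check mentioned above, and its limitation, can be sketched in a few lines (the helper names are ours, chosen for illustration):

```python
def add_parity(bits):
    """Append one redundant bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    """An odd count of 1s reveals that some bit was flipped in transit."""
    return sum(bits) % 2 == 0

word = [1, 0, 1, 1, 0, 1, 0, 0, 1]   # a 9-bit code word
sent = add_parity(word)
assert parity_ok(sent)

corrupted = sent[:]
corrupted[3] ^= 1                     # one bit flipped by channel noise
assert not parity_ok(corrupted)       # single error detected...

corrupted[5] ^= 1                     # ...but a second flip
assert parity_ok(corrupted)           # ...restores even parity: error missed
```

The final assertion illustrates the limitation the summary notes: a single parity check detects any odd number of errors but is blind to an even number, which motivates the more elaborate error-detecting and error-correcting codes discussed later.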

Chapter XII: NEURAL NETWORKS - VON NEUMANN

ONE way of reorienting neural-network theory towards neurophysiology is to attenuate the rigidity of some of its underlying assumptions to secure greater conformity to neurophysiological reality. Among other assumptions there is one which is obviously contrary to known facts, namely, the assumed infallibility of neurons. Von Neumann therefore took the next important step of dispensing with the postulated perfection of neural behavior by explicitly incorporating into the network theory the possibility of neural malfunction. That is, he inquired whether a reliable automaton could be designed using unreliable logical elements like neurons, which have a certain (statistical) probability of failing or misfiring. He proved in his customary elegant way that it is possible to assemble individually unreliable components into an automaton with any arbitrarily high reliability.

This is a result on a level with the remarkable theorem of Shannon on noisy channels described earlier in Chapter IV. Just as there is no a priori reason why it should be possible to transmit information across a noisy channel with arbitrarily high reliability, a result guaranteed by Shannon's theorem, so also it is difficult to visualize how basically unreliable elements could be synthesized to yield automata designed to function with as high a degree of reliability as we desire, a result ensured by von Neumann's demonstration. Nevertheless, as we shall presently see, it can be achieved in an analogous way, that is, by lavish provision of redundancy.

Von Neumann begins by stipulating that every basic universal organ used in the synthesis of any automaton or neuron network has a certain assigned statistical probability (ε) of failure or malfunctioning. Although actually the probabilities of neural misfire are statistically dependent on the general state of the network and on one another . . .

This chapter discusses von Neumann's approach to neural network theory by relaxing the assumption of neuron infallibility and incorporating the possibility of neural malfunction. Von Neumann demonstrated that reliable automata can be constructed from unreliable components by employing redundancy, analogous to Shannon's theorem on noisy communication channels. The chapter explains the probabilistic modeling of neuron failure and the use of multiplexing and majority voting mechanisms to maintain high reliability despite individual component errors.
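The majority-voting mechanism the summary mentions can be illustrated with its simplest case, triple redundancy. Under the simplifying assumption that each replica fails independently with probability ε (the book's dependence caveat set aside), the voted output is wrong only when a majority of replicas fail. A sketch, with `majority_failure` as our own illustrative name:

```python
from math import comb

def majority_failure(eps, n=3):
    """Probability that a majority of n independent replicas fail,
    each failing with probability eps (n odd)."""
    return sum(comb(n, k) * eps**k * (1 - eps)**(n - k)
               for k in range(n // 2 + 1, n + 1))

eps = 0.01
# Triplication: fails only if at least 2 of 3 fail,
# i.e. 3*eps^2*(1-eps) + eps^3, far below eps itself.
print(majority_failure(eps, 3))
# Lavish redundancy drives the failure probability down further still.
print(majority_failure(eps, 9))
```

This mirrors von Neumann's conclusion in miniature: each added layer of redundant components buys an exponential improvement in overall reliability, at a steep cost in hardware.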