channel-capacity
The maximum theoretical rate at which information can be transmitted over a channel, measured in bits per second.
2 chapters across 1 book
chapter III INFORMATION FLOW OVER DISCRETE CHANNELS

HAVING devised a measuring scheme for handling the basic problem of communications engineering, namely, the transmission of information over any given channel, it is time to proceed to the problem itself. This consists simply of evaluating the operating efficiency of a communications channel, that is, matching its actual performance against its optimum potential. A natural measure thereof is the ratio of the actual rate of flow of information to its ultimate capacity. But what is a channel over which information flows, and what is its capacity?

A channel is any physical medium, such as a wire, a cable, a radio or television link, or magnetic tape, whereby we may either transmit information or store it, as in a memory device like a tape. The transmission and/or storage takes place by a code of symbols, which may be pulses of current of varying duration as in telegraphy, light flashes as in navigation, or radio signals of different intensity, polarity, and so forth. Thus, in teletype, signals are transmitted by a code of two symbols made out of the presence or absence of a current pulse for a given duration which is the same for both. These two symbols enable a modern printing telegraph system to transmit any given English text by means of what is commonly called the Baudot code. In this system five impulses are transmitted for every letter, any one of which may be either a current pulse or a gap. That is, in each of the five impulses the circuit is either closed (current present) or open (current absent). With such a code it is possible to obtain 2 × 2 × 2 × 2 × 2 = 2⁵ = 32 different permutations, of which twenty-six are assigned to letters of the alphabet and five to other functions such as space, figure shift, or letter shift, leaving one spare.
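The counting argument above (five two-valued impulses giving 2⁵ = 32 patterns, of which 31 are assigned) can be sketched in a few lines of Python. The letter-to-pattern assignment here is hypothetical, purely for illustration; the real Baudot/ITA2 assignment differs.

```python
from itertools import product

# Enumerate all 2^5 = 32 five-impulse patterns (1 = current pulse, 0 = gap).
patterns = ["".join(bits) for bits in product("01", repeat=5)]

# Hypothetical assignment for illustration: 26 letters plus 5 control
# functions fill 31 patterns, leaving exactly one spare.
letters = [chr(ord("A") + i) for i in range(26)]
functions = ["SPACE", "FIGURE SHIFT", "LETTER SHIFT",
             "CARRIAGE RETURN", "LINE FEED"]
codebook = dict(zip(patterns, letters + functions))
spare = patterns[len(letters) + len(functions):]

print(len(patterns))   # 32 possible permutations
print(len(codebook))   # 31 assigned
print(spare)           # the one spare pattern
```

With equal-duration impulses, each five-impulse group carries at most log₂ 32 = 5 bits, which is why the chapter's capacity measure follows directly from counting permutations.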
The five impulses making up the code are sent to the line successively by means of a rotating distributor or commutator and are distributed at the…

This chapter addresses the evaluation of communication channel efficiency by comparing actual information flow rates to the channel's theoretical capacity. It explains the concept of a communication channel as a physical medium transmitting coded symbols, illustrating this with examples like the Baudot code in teletype systems and Morse code in telegraphy. The chapter further explores the mathematical modeling of information transmission over discrete channels, including cases with symbols of equal and unequal durations, and introduces finite difference equations as a tool to handle complexities arising from unequal symbol durations.
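The summary's point about unequal symbol durations can be made concrete. In Shannon's treatment of a noiseless discrete channel, the capacity is C = log₂ X₀, where X₀ is the largest real root of X^(−t₁) + … + X^(−tₙ) = 1 for symbol durations t₁…tₙ. A minimal numerical sketch, with illustrative Morse-like durations (dot = 1 unit, dash = 3 units) that are assumptions, not figures from the book:

```python
import math

def capacity(durations, lo=1.0, hi=16.0, iters=100):
    """Capacity (bits per unit time) of a noiseless discrete channel
    with the given symbol durations, via bisection on the
    characteristic equation sum(X**-t) = 1."""
    f = lambda x: sum(x ** -t for t in durations) - 1  # decreasing in x
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2((lo + hi) / 2)

# Two symbols of equal duration 1: exactly 1 bit per unit time.
print(capacity([1, 1]))   # ≈ 1.0

# Unequal durations (dot = 1, dash = 3): capacity drops below 1 bit.
print(capacity([1, 3]))   # ≈ 0.551
```

In the equal-duration case the formula reduces to log₂ n divided by the common duration, matching the permutation-counting argument of the chapter.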
chapter IV CODING THEORY

IF the capacity of a communications channel is the ultimate limit which the actual rate of information flow can never surpass, how closely may it be attained and by what means? The answer is provided by two famous theorems of C. E. Shannon about coding; coding, because, as will be recalled, a code is to a communications channel what a transformer is to electrical transmission, that is, a means of improving its operating efficiency. Just as to obtain maximum power transfer from a generator to a load a transformer matching the load resistance must in general be introduced, so also there has to be an appropriate code matching the statistical structure of the language used in transmission. That is why it is no mere accident that coding theorems should be so intimately linked with the optimum utilization of channel capacity.

Although a source or transmitter may transmit a message in its original form* without coding, even as a grid may transmit power without recourse to a transformer, it has in practice to be enciphered by a code into a form acceptable to the channel. The exigencies of the channel often require coding because it is only in a coded form that the signal can actually travel over it. A case in point is the dash-dot sequence of telegraphy into which a message must be transformed before transmission and from which it must be decoded back into the original text upon arrival. In general, the transmitter or message source feeds the message into an encoder before transmission, and the decoder restores it to its original form on emergence from the receiver. Obviously, given a communications system and the type of symbols it can handle, any code constructed with them could be arbitrarily…

* As, for example, in ordinary living-room conversation or, for that matter, telephonic conversation, wherein the transmitter merely changes the pattern of the audible voice into an equivalent pattern of electrical currents on the telephone wires.
This chapter explores the fundamental principles of coding theory as established by Claude E. Shannon, emphasizing the importance of matching codes to the statistical structure of the transmitted language to optimize channel capacity. It illustrates how coding transforms messages into forms suitable for transmission over a channel, using examples such as Morse code and teletype signals, and demonstrates the construction of optimal codes through probability-based partitioning to maximize information flow. The chapter highlights Shannon's theorem guaranteeing the existence of codes that approach channel capacity arbitrarily closely, and it explains the process of assigning shorter codes to more probable messages to achieve efficient, uniquely decipherable communication.
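The "probability-based partitioning" the summary mentions can be sketched with a minimal Shannon–Fano procedure: sort messages by probability, split the list where the two halves' total probabilities are closest, prefix one half with 0 and the other with 1, and recurse. The message set and probabilities below are invented for illustration.

```python
def shannon_fano(items):
    """items: list of (symbol, probability) pairs, sorted by
    descending probability. Returns a dict symbol -> binary code."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    running, split, best = 0.0, 1, float("inf")
    for i in range(1, len(items)):          # find the most even split
        running += items[i - 1][1]
        gap = abs(2 * running - total)      # |left total - right total|
        if gap < best:
            best, split = gap, i
    codes = {}
    for sym, code in shannon_fano(items[:split]).items():
        codes[sym] = "0" + code             # left half gets prefix 0
    for sym, code in shannon_fano(items[split:]).items():
        codes[sym] = "1" + code             # right half gets prefix 1
    return codes

messages = [("A", 0.5), ("B", 0.25), ("C", 0.125), ("D", 0.125)]
print(shannon_fano(messages))  # more probable messages get shorter codes
```

For these dyadic probabilities the code lengths come out as 1, 2, 3, and 3 bits, exactly −log₂ p for each message, so the average code length attains the source entropy, which is the limiting behavior Shannon's theorem guarantees codes can approach.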