Item #2806 Communication / Information Theory: A Collection. CLAUDE SHANNON, HARRY NYQUIST, RALPH HARTLEY, NORBERT WIENER.

A remarkably complete collection of works documenting the history of the theory of communication of information – what ‘information’ actually is, and what theoretical restrictions govern its accurate transmission from source to receiver.

Note: The numbers in brackets correspond to the titles listed in the accompanying pdf, accessible via the link below the images.

The first group of works details the development and proof of what is now called the ‘Nyquist-Shannon sampling theorem’. If an analog signal (e.g., voice or music) is to be converted to a digital signal consisting of binary zeros and ones (‘bits’), the theorem states that sampling at a rate of at least twice the highest frequency present in the signal captures it completely, making it possible to reconstruct the original signal exactly. This theorem laid the foundation for many advances in telecommunications.
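In modern notation (a standard statement, not one drawn verbatim from the works offered here), for a signal x(t) containing no frequencies above B, the sampling condition and the associated reconstruction formula read

\[ f_s = \frac{1}{T} \ge 2B, \qquad x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,\mathrm{sinc}\!\left(\frac{t-nT}{T}\right), \qquad \mathrm{sinc}(u) = \frac{\sin \pi u}{\pi u}. \]

Sampling at or above the rate 2B (the ‘Nyquist rate’) therefore loses nothing, while sampling below it causes aliasing, and the original signal can no longer be recovered.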

The first evidence for the sampling theorem was found experimentally by Miner in 1903 [8]. It was formally proposed by Nyquist in 1924 [9, 10] and by Küpfmüller in 1928 [8], but first proved by Nyquist [12] and later by Küpfmüller’s student Raabe [8]. In 1941, Bennett [15] referred to Raabe’s work and generalized it. A result equivalent to the sampling theorem had, however, been proved by Whittaker as early as 1915 [8, 14] in the context of interpolation theory. Finally, in 1948 Shannon [8, 19] published a proof of both the sampling theorem and the interpolation formula as one part of his broader development of information theory.

The term ‘information’, as a precise concept susceptible of measurement, was coined by Hartley in 1928 [11]. “Hartley distinguished between meaning and information. The latter he defined as the number of possible messages independent of whether they are meaningful. He used this definition of information to give a logarithmic law for the transmission of information in discrete messages … Hartley had arrived at many of the most important ideas of the mathematical theory of communication: the difference between information and meaning, information as a physical quantity, the logarithmic rule for transmission of information, and the concept of noise as an impediment in the transmission of information” (Origins of Cyberspace 316).
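Hartley’s logarithmic law is usually written today (in modern notation rather than his own) as

\[ H = n \log s \]

for a message formed by n successive selections from an alphabet of s symbols, so that the amount of information grows in proportion to the length of the message and adds across independent selections.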

In the following year, the physicist Szilard established the connection between information and the thermodynamic quantity ‘entropy’. “Szilard described a theoretical model that served both as a heat engine and an information engine, establishing the relationship between information (manipulation and transmission of bits) and thermodynamics (manipulation and transfer of energy and entropy). He was one of the first to show that ‘Nature seems to talk in terms of information’” (Seife, Decoding the Universe, 2007, p. 77).

Another physicist, Gabor, pointed out the relation between the sampling theorem and the uncertainty principle in quantum mechanics [16]: “Signals do not have arbitrarily precise time and frequency localization. It doesn’t matter how you compute a spectrum, if you want time information, you must pay for it with frequency information. Specifically, the product of time uncertainty and frequency uncertainty must be at least 1/4π.”
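In symbols, with Δt and Δf the effective spreads of a signal in time and in frequency, the Gabor limit quoted above reads

\[ \Delta t \, \Delta f \ge \frac{1}{4\pi}. \]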

In 1942 Wiener issued a classified memorandum (published in 1949 [23]) which combined ideas from statistics and time-series analysis, and used Gauss’s method of shaping the characteristic of a detector to allow for the maximal recognition of signals in the presence of noise. This method came to be known as the ‘Wiener filter’. In his Mathematical Theory of Communication (1948) [19], Shannon notes: “Communication theory is heavily indebted to Wiener for much of its basic philosophy and theory. His classic NDRC report ‘The Interpolation, Extrapolation, and Smoothing of Stationary Time Series’, to appear soon in book form, contains the first clear-cut formulation of communication theory as a statistical problem, the study of operations on time series.” Many of the developments in communications theory up to 1948 were summarized and systematized in Wiener’s famous book on cybernetics [17].
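For orientation (a standard textbook formulation rather than the notation of Wiener’s memorandum), the non-causal Wiener filter for a signal observed in additive noise uncorrelated with it has frequency response

\[ W(f) = \frac{S_x(f)}{S_x(f) + S_n(f)}, \]

where S_x(f) and S_n(f) are the power spectral densities of signal and noise: the filter passes frequencies at which the signal dominates and suppresses those at which the noise dominates.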

It is this work of Shannon’s that represents the real birth of modern information theory. “Claude Shannon's creation in the 1940s of the subject of information theory is one of the great intellectual achievements of the twentieth century” (Sloane & Wyner, Claude Elwood Shannon Collected Papers, 1993, p. 3). “Probably no single work in this century has more profoundly altered man's understanding of communication than C. E. Shannon’s article, ‘A mathematical theory of communication’, first published in 1948” (Slepian, Key papers in the development of information theory, 1974). “Th[is] paper gave rise to ‘information theory’, which includes metaphorical applications in very different disciplines, ranging from biology to linguistics via thermodynamics or quantum physics on the one hand, and a technical discipline of mathematical essence, based on crucial concepts like that of channel capacity, on the other … The 1948 paper rapidly became very famous; it was published one year later as a book, with a postscript by Warren Weaver regarding the semantic aspects of information” (DSB).

“The revolutionary elements of Shannon's contribution were the invention of the source-encoder-channel-decoder-destination model, and the elegant and remarkably general solution of the fundamental problems which he was able to pose in terms of this model. Particularly significant is the demonstration of the power of coding with delay in a communication system, the separation of the source and channel coding problems, and the establishment of fundamental natural limits on communication.

“Shannon created several original mathematical concepts. Primary among these is the notion of the ‘entropy’ of a random variable (and by extension of a random sequence), the ‘mutual information’ between two random variables or sequences, and an algebra that relates these quantities and their derivatives. He also achieved a spectacular success with his technique of random coding, in which he showed that an encoder chosen at random from the universe of possible encoders will, with high probability, give essentially optimal performance” (Sloane & Wyner, pp. 3-4). In [26], Shannon famously estimated the entropy rate of English text at roughly 1.0 to 1.5 bits per letter, and as low as 0.6 to 1.3 bits per letter when long-range statistical structure is taken into account.
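Shannon’s entropy of a discrete random variable X with probabilities p(x) is H(X) = −Σ p(x) log₂ p(x), measured in bits, and the mutual information between X and Y is I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal Python sketch (using rounded published English letter frequencies as illustrative input, not Shannon’s own data) shows how a first-order estimate of roughly four bits per letter is obtained; the far lower figures of [26] emerge only when the longer-range statistical structure of the language is taken into account.

from math import log2

# Rounded single-letter frequencies of English text (illustrative values only).
freq = {
    'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070, 'n': 0.067,
    's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043, 'l': 0.040, 'c': 0.028,
    'u': 0.028, 'm': 0.024, 'w': 0.024, 'f': 0.022, 'g': 0.020, 'y': 0.020,
    'p': 0.019, 'b': 0.015, 'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002,
    'q': 0.001, 'z': 0.001,
}

# Normalize so the probabilities sum to one, then apply H = -sum p * log2(p).
total = sum(freq.values())
H = -sum((p / total) * log2(p / total) for p in freq.values())
print(f"First-order entropy of English letters: {H:.2f} bits per letter")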

Perhaps the crown jewel of Shannon’s paper [19] was the ‘noisy channel coding theorem’ (Theorem 11). This states that for any source whose entropy per second does not exceed the capacity of the channel, it is possible to process (encode) that source at the channel input, and to process (decode) the received signal at the output, in such a way that the error rate (in source symbol errors per second) is as small as desired.
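Stated compactly (paraphrasing the 1948 paper): for a discrete channel of capacity C bits per second and a source of entropy H bits per second,

\[ H \le C \;\Rightarrow\; \text{transmission with arbitrarily small error frequency is possible}, \qquad H > C \;\Rightarrow\; \text{the equivocation cannot be brought below } H - C. \]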

Shannon’s first paper after ‘A mathematical theory of communication’ was ‘The Philosophy of PCM’ [18], written with co-authors Oliver and Pierce. In typical large communication systems, a message must travel through many links before reaching its destination. If the message is analog, a little noise is added on each link, so the message continually degrades. In a digital system, however, ‘regenerative repeaters’ at the end of each link can make decisions on the discrete transmitted signals and forward a noise-free reconstructed version, subject only to a small probability of error – this was ‘pulse-code modulation’. The enduring message of this paper is that digital transmission has a major advantage over analog transmission in the faithful reproduction of the source when communication is over multiple-link paths.
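A small numerical sketch of the effect described above (illustrative only: the number of links and the noise level are arbitrary choices, not figures from the paper). Over a chain of links, an analog relay accumulates noise at every hop, whereas a regenerative repeater re-decides each binary pulse and forwards a clean copy, failing only with a small per-hop probability.

import random

random.seed(0)
LINKS = 20      # number of links in the path (illustrative)
SIGMA = 0.15    # per-link noise standard deviation (illustrative)
bits = [random.randint(0, 1) for _ in range(10_000)]

# Analog relay: the waveform value is forwarded as-is, so noise accumulates.
analog = [float(b) for b in bits]
for _ in range(LINKS):
    analog = [x + random.gauss(0.0, SIGMA) for x in analog]
analog_errors = sum((x > 0.5) != bool(b) for x, b in zip(analog, bits))

# Regenerative (PCM) repeater: each hop decides 0 or 1 and forwards a clean pulse.
digital = bits[:]
for _ in range(LINKS):
    digital = [int(b + random.gauss(0.0, SIGMA) > 0.5) for b in digital]
digital_errors = sum(d != b for d, b in zip(digital, bits))

print(f"analog relay errors over {LINKS} links: {analog_errors} / {len(bits)}")
print(f"regenerated errors over {LINKS} links:  {digital_errors} / {len(bits)}")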

The focus of Shannon’s next paper, ‘Communication in the presence of noise’ [8], is on the transmission of continuous-time (or ‘waveform’) sources over continuous-time channels. Using the sampling theorem, Shannon shows how waveform signals can be represented by vectors in finite-dimensional Euclidean space. He then exploits this representation to establish important facts concerning the communication of a waveform source over a waveform channel in the presence of waveform noise. In particular, he gives a geometric proof of the theorem that establishes the famous formula W log (1 + S) for the capacity of a channel with bandwidth W, additive thermal (i.e., Gaussian) noise, and signal-to-noise ratio S. Similar results were obtained at about the same time by Tuller [2] and Fano [21].
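Written out, with P the average signal power and N the Gaussian noise power in the band (so that S = P/N), the capacity in bits per second is

\[ C = W \log_2\!\left(1 + \frac{P}{N}\right) = W \log_2 (1 + S). \]

No scheme can signal reliably above this rate, while rates arbitrarily close to it are achievable – the continuous-channel counterpart of the coding theorem described above.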

Shannon made an application of the ideas in ‘A mathematical theory of communication’ in ‘Memory requirements in a telephone exchange’ [25]. In operation, such an exchange must ‘remember’ which subscribers are connected together until the corresponding calls are completed. This requires a certain amount of internal memory, depending on the number of subscribers, the maximum calling rate, etc. Too much memory would make the cost of new telephone equipment prohibitive; too little, and subscribers might be disconnected or incorrectly billed.

Shannon’s next major contributions to information theory came in the mid-1950s, when coding theory was still in its infancy, and no one had much sense of whether the coding theorem proved in ‘A mathematical theory of communication’ was simply a mathematical curiosity, or would someday transform communications practice. Coding theorists had attempted to find the best codes as a function of block length, but without success except in a few very special cases. Information theorists therefore began to seek upper and lower bounds on error probability as functions of block length. The first such result [30] showed that error probability decreases exponentially with block length in certain cases, but was not explicit about the exponent.

The first of Shannon’s contributions to this field is ‘The zero-error capacity of a noisy channel’ [31], which determined the least upper bound of rates at which it is possible to transmit information with zero probability of error. When no errors at all are permitted, the probabilistic aspects of channel coding disappear, and only graph-theoretic aspects remain. It was also shown that feedback from receiver to transmitter can increase the zero-error capacity of memoryless channels. The second is ‘Certain results in coding theory for noisy channels’ [32], which showed that the probability of error could be made to decrease exponentially with code block length at rates less than capacity. ‘Probability of error for optimal codes in a Gaussian channel’ [35] was also concerned with the exponential dependence of error probability on block length. Here, Shannon introduced the ‘reliability function,’ the exponent of the minimum achievable probability of error as a function of signaling rate. ‘Coding theorems for a discrete source with a fidelity criterion’ [34] is an expansion of the results at the end of ‘A mathematical theory of communication’. Shannon began here with a simple discrete source with independent identically distributed letters and a single-letter distortion measure, and gave a simple and detailed proof of the rate-distortion theorem. He then generalized to more general sources and distortion measures, finally including analog sources.
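In modern notation (standard definitions rather than quotations from these papers), the two central quantities of [35] and [34] are

\[ E(R) = \limsup_{n\to\infty} \left( -\tfrac{1}{n} \log P_e^{*}(n, R) \right), \qquad R(D) = \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})] \le D} I(X; \hat{X}), \]

where P_e^{*}(n, R) is the smallest error probability achievable by a block code of length n and rate R, and d(x, x̂) is the single-letter distortion measure; a source can be reproduced within average distortion D over any channel whose capacity exceeds R(D).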

After 1950 the field of communication theory exploded, with the publication of numerous articles, both technical and popular [27-30, 36-39]. “The impact of Shannon’s theory of information on the development of telecommunication has been immense … The notion that a channel has a specific information capacity, which can be measured in bits per second, has had a profound influence. On the one hand, this notion offers the promise, at least in theory, of communication systems with frequency of errors as small as desired for a given channel for any data rate less than the channel capacity. Moreover, Shannon’s associated existence proof provided tantalizing insight into how ideal communication systems might someday fulfill the promise. On the other hand, this notion also clearly establishes a limit on the communication rate that can be achieved over a channel, offering communication engineers the ultimate benchmark with which to calibrate progress toward construction of the ultimate communication system for a given channel. The fact that a specific capacity can be reached, and that no data transmission system can exceed this capacity, has been the holy grail of modern design for the last fifty years. Without the guidance of Shannon’s capacity formula, modern designers would have stumbled more often and proceeded more slowly. Communication systems ranging from deep-space satellite links to storage devices such as magnetic tapes and ubiquitous compact discs, and from high-speed internets to broadcast high-definition television, came sooner and in better form because of his work” (Blahut & Hajek, Foreword to the book edition of ‘A mathematical theory of communication’).

Please see the posted images and click on the pdf below them for the inventory of the collection.

Price: $90,000.