Springer, 2015. — 471 p. — (Foundations in Signal Processing, Communications and Networking; 11). — ISBN: 978-3-319-12522-0.
Classical information processing consists of four main tasks: gaining knowledge, and storing, transmitting, and hiding data.
The first of these tasks is the prime goal of Statistics; for the next two, Shannon presented an impressive mathematical theory, called Information Theory, which he based on probabilistic models.
The basic concepts of this theory are codes, lossless and lossy, which achieve small error probabilities in spite of the noise in transmission, which is modeled by channels.
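To make the probabilistic model concrete, here is the standard textbook example (an illustration of Shannon's setting, not an excerpt from the book): a binary symmetric channel flips each transmitted bit independently with probability p, and its capacity is

    C = 1 - h(p), \qquad h(p) = -p \log_2 p - (1-p) \log_2 (1-p),

so codes with vanishing error probability exist at every rate R < C and at no rate above it.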
Another way to deal with noise is based on the combinatorial concept of error-correcting codes, pioneered by Hamming. This suggests a second way to look at Information Theory: instead of classifying it by its tasks, one can classify it by its mathematical structures and methods, primarily probabilistic versus combinatorial.
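As a minimal sketch of Hamming's combinatorial idea (an illustrative example, not code from the book; the function names are our own), the (7,4) Hamming code expands 4 data bits into 7 so that any single flipped bit can be located and corrected:

    def hamming74_encode(d):
        """Encode 4 data bits (a list of 0/1) into a 7-bit codeword."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4               # parity over codeword positions 3, 5, 7
        p2 = d1 ^ d3 ^ d4               # parity over codeword positions 3, 6, 7
        p4 = d2 ^ d3 ^ d4               # parity over codeword positions 5, 6, 7
        return [p1, p2, d1, p4, d2, d3, d4]     # positions 1..7

    def hamming74_decode(c):
        """Correct at most one flipped bit and return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
        s4 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
        err = s1 + 2 * s2 + 4 * s4      # syndrome = position of the error, 0 = none
        if err:
            c = c.copy()
            c[err - 1] ^= 1             # flip the corrupted bit back
        return [c[2], c[4], c[5], c[6]]

    word = hamming74_encode([1, 0, 1, 1])
    word[3] ^= 1                        # simulate one bit of channel noise
    assert hamming74_decode(word) == [1, 0, 1, 1]

In contrast with the probabilistic view, nothing here is random: the code is designed so that any two codewords differ in at least three positions, which is exactly what makes one-bit correction possible.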
Finally, Shannon also laid the foundations of a theory of hiding data, called Cryptology. Its task is in a sense dual to transmission, and we therefore prefer to view it as a subfield of Information Theory.
Viewed in terms of mathematical structures, Shannon's work again contains both a probabilistic and a combinatorial or complexity-theoretic model.
The lectures are suitable for graduate students in Mathematics and, after some preparation in basic Mathematics, also for students of Theoretical Computer Science, Physics, and Electrical Engineering.
The lectures can be selected for courses, or as supplements to courses, in many ways.
Transmitting Data
Special Channels
Algorithms for Computing Channel Capacities and Rate-Distortion Functions
Shannon's Model for Continuous Transmission
On Sliding-Block Codes
On λ-Capacities and Information Stability
Channels with Infinite Alphabets
Gaining Data
Selected Topics of Information Theory and Mathematical Statistics
β-Biased Estimators in Data Compression