Springer, 2014. — 304 p. — (Foundations in Signal Processing, Communications and Networking; 10). — ISBN: 978-3-319-05479-7.
Classical information processing consists of four main tasks: gaining knowledge, and storing, transmitting, and hiding data.
The first task is the prime goal of Statistics; for the next two, Shannon presented an impressive mathematical theory, called Information Theory, which he based on probabilistic models.
Built on this theory are the concepts of lossless and lossy codes, which keep the error probability small in spite of noise in the transmission, modeled by channels.
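For orientation, the two central quantities of this probabilistic model are the entropy of a source and the capacity of a channel (standard definitions, not quoted from the book):

H(X) = -\sum_x P_X(x) \log P_X(x), \qquad C = \max_{P_X} I(X;Y)

The entropy is the optimal rate of lossless compression; the capacity is the largest rate at which reliable transmission over a noisy channel is possible.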
Another way to deal with noise is the combinatorial concept of error-correcting codes, pioneered by Hamming; a minimal sketch of his (7,4) code follows below. This suggests a second way to classify Information Theory: not by its tasks, but by its mathematical structures and methods, primarily probabilistic versus combinatorial.
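The sketch below, in Python, is an illustration and not taken from the lectures: three parity bits protect four data bits, so that any single flipped bit can be located and corrected.

import random

# Minimal sketch of the Hamming(7,4) code (illustrative only, not from the book).
# Codeword positions 1..7; parity bits sit at positions 1, 2 and 4.

def encode(d):
    # d = [d1, d2, d3, d4]: four data bits.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4            # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4            # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # Recompute the three parity checks; the syndrome is the 1-based
    # position of a single error, or 0 if the word is consistent.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c = c[:]                 # copy, so the caller's list is untouched
        c[syndrome - 1] ^= 1     # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = encode(data)
word[random.randrange(7)] ^= 1   # the channel flips one bit at random
assert decode(word) == data      # ... and the code always corrects it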
Finally, Shannon also laid the foundations of a theory of hiding data, called Cryptology. Its task is in a sense dual to transmission, and we therefore prefer to view it as a subfield of Information Theory.
Classified by mathematical structure, Shannon's work again contains both a probabilistic and a combinatorial, or complexity-theoretic, model.
The lectures are suitable for graduate students in Mathematics, and, after some preparation in basic Mathematics, also for those in Theoretical Computer Science, Physics, and Electrical Engineering.
The lectures can be selected for courses, or as supplements to courses, in many ways.
Part I Storing Data.
Data Compression.
The Entropy as a Measure of Uncertainty.
Universal Coding.
Part II Transmitting Data.
Coding Theorems and Converses for the DMC.
Towards Capacity Functions.
Error Bounds.
Part III Appendix.
Inequalities.
Rudolf Ahlswede 1938–2010.
Comments by Holger Boche.