The definition of information goes back to Shannon's landmark paper of 1948. His theory of how information can be processed is the basis of all efficient digital communication systems, both today and in the future. This course provides an up-to-date introduction to the topic of information theory. The course emphasizes both the formal development of the theory and the engineering implications for the design of communication systems and other information-handling systems. The course includes:
* Shannon's information measure and its relatives, for both the discrete and the continuous case.
* Three fundamental information-theoretic results: typical sequences, the source coding theorem, and the channel coding theorem.
* Source coding: Optimal coding and construction of Huffman codes, as well as universal source coding such as Ziv-Lempel coding (zip, etc.).
* Channel coding: Principles of error detection and correction on a noisy channel, mainly illustrated by Hamming codes.
* Gaussian channel: Continuous sources and additive white noise over both band-limited and frequency-selective channels, as well as the multi-dimensional Gaussian channel for MIMO systems. Derivation of the fundamental Shannon limit.
* Discrete-input Gaussian channel: Maximum achievable rates for PAM and QAM, coding and shaping gains, and the SNR gap.
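As a small taste of the source coding topic, the construction of a Huffman code can be sketched in a few lines of Python (the function name and the example probabilities below are illustrative, not part of the course material):

```python
import heapq

def huffman_lengths(probs):
    """Build a Huffman code by repeatedly merging the two least
    probable subtrees; return each symbol's codeword length."""
    # Heap entries: (probability, unique tiebreak id, symbols in subtree)
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    next_id = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:   # every symbol in the merged subtree
            lengths[s] += 1       # gains one more bit in its codeword
        heapq.heappush(heap, (p1 + p2, next_id, syms1 + syms2))
        next_id += 1
    return lengths

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
print(huffman_lengths(probs))  # {'a': 1, 'b': 2, 'c': 3, 'd': 3}
```

For this dyadic source the average codeword length (1.75 bits) meets the entropy exactly, which is the optimality promised by the source coding theorem.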
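The error correction principle behind the channel coding topic can be illustrated with the (7,4) Hamming code, which corrects any single bit error. This is a minimal sketch in systematic form; the function names are illustrative:

```python
# Generator and parity-check matrices of the (7,4) Hamming code,
# systematic form: codeword = [d1 d2 d3 d4 | p1 p2 p3].
G = [[1,0,0,0,1,1,0],
     [0,1,0,0,1,0,1],
     [0,0,1,0,0,1,1],
     [0,0,0,1,1,1,1]]
H = [[1,1,0,1,1,0,0],
     [1,0,1,1,0,1,0],
     [0,1,1,1,0,0,1]]

def encode(data):
    # codeword bit j is the mod-2 dot product of data with column j of G
    return [sum(d * g[j] for d, g in zip(data, G)) % 2 for j in range(7)]

def correct(received):
    # the syndrome equals the column of H at the flipped position,
    # so a single bit error can be located and corrected
    syndrome = [sum(h[j] * received[j] for j in range(7)) % 2 for h in H]
    if any(syndrome):
        err = [[h[j] for h in H] for j in range(7)].index(syndrome)
        received = received.copy()
        received[err] ^= 1
    return received

data = [1, 0, 1, 1]
cw = encode(data)              # [1, 0, 1, 1, 0, 1, 0]
noisy = cw.copy()
noisy[2] ^= 1                  # one bit flip on the noisy channel
print(correct(noisy) == cw)    # True: the error is corrected
```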
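The Shannon limit for the band-limited AWGN channel, C = W log2(1 + SNR), is simple enough to evaluate directly; a one-function sketch (function name and example numbers are illustrative):

```python
from math import log2

def awgn_capacity(snr_db, bandwidth_hz=1.0):
    """Shannon limit of the band-limited AWGN channel:
    C = W * log2(1 + SNR) bits per second."""
    snr = 10 ** (snr_db / 10)          # convert dB to a linear ratio
    return bandwidth_hz * log2(1 + snr)

# e.g. a 1 MHz channel at 20 dB SNR
print(awgn_capacity(20, 1e6))          # ~6.66 Mbit/s
```

No code, however long its block length, can signal reliably above this rate; the SNR gap studied in the last topic measures how far practical PAM/QAM schemes fall short of it.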