The course covers the fundamental concepts of Shannon's information theory. Key topics include entropy, information, data compression, channel coding, and codes.
Learning objective
The goal of the course is to familiarize students with the theoretical foundations of information theory and to illustrate the practical use of the theory with selected examples from data compression and coding.
Content
Introduction and motivation, basics of probability theory, entropy and information, Kraft inequality, bounds on expected length of source codes, Huffman coding, asymptotic equipartition property and typical sequences, Shannon's source coding theorem, channel capacity and channel coding, Shannon's noisy channel coding theorem, examples
Literature
T. Cover, J. Thomas: Elements of Information Theory, John Wiley & Sons, 1991.
D. MacKay: Information Theory, Inference and Learning Algorithms, Cambridge University Press, 2003.
C. Shannon: A Mathematical Theory of Communication, Bell System Technical Journal, 1948.
Performance assessment