252-0055-00L Information Theory
|Spring Semester 2021
|J. M. Buhmann
|yearly recurring course
|Language of instruction
|The course covers the fundamental concepts of Shannon's information theory.
The most important topics are: entropy, information, data compression, channel coding, and error-correcting codes.
|The goal of the course is to familiarize students with the theoretical fundamentals of information theory and to illustrate the practical use of the theory with selected examples from data compression and coding.
|Introduction and motivation, basics of probability theory, entropy and information, Kraft inequality, bounds on expected length of source codes, Huffman coding, asymptotic equipartition property and typical sequences, Shannon's source coding theorem, channel capacity and channel coding, Shannon's noisy channel coding theorem, examples
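Several of the listed topics fit in a few lines of code. The following Python sketch (an illustration, not course material) computes the Shannon entropy H(X) of a distribution and constructs a binary Huffman code; for the chosen dyadic example the expected codeword length L meets the source coding bound H(X) <= L < H(X) + 1 with equality, and the codeword lengths satisfy the Kraft inequality sum 2^(-l_i) <= 1.

```python
import heapq
from math import log2

def entropy(p):
    # Shannon entropy in bits: H(X) = -sum_i p_i * log2(p_i)
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_code(p):
    # Huffman's algorithm: repeatedly merge the two least probable
    # nodes; returns a prefix-free binary code {symbol: codeword}.
    heap = [(pi, i, {i: ""}) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    counter = len(p)  # tiebreaker so dicts are never compared
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

p = [0.5, 0.25, 0.125, 0.125]        # dyadic distribution
code = huffman_code(p)
H = entropy(p)                        # 1.75 bits
L = sum(p[s] * len(w) for s, w in code.items())  # expected length: 1.75
kraft = sum(2 ** -len(w) for w in code.values())  # 1.0, so Kraft holds
```

Because the probabilities are negative powers of two, the Huffman code is exactly optimal (L = H); for general distributions only the bound L < H + 1 is guaranteed.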
|T. Cover, J. Thomas: Elements of Information Theory. John Wiley, 1991.
D. MacKay: Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003.
C. Shannon: A Mathematical Theory of Communication. Bell System Technical Journal, 1948.