# Search result: Catalogue data in Autumn Semester 2020

## Mathematics Master

### Application Area

Necessary and eligible only for the Master's degree in Applied Mathematics. One of the specified application areas must be selected for the category Application Area. At least 8 credits are required in the chosen application area.

### Information and Communication Technology

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 227-0105-00L | Introduction to Estimation and Machine Learning | W | 6 credits | 4G | H.‑A. Loeliger |

**Abstract**: Mathematical basics of estimation and machine learning, with a view towards applications in signal processing.

**Learning objective**: Students master the basic mathematical concepts and algorithms of estimation and machine learning.

**Content**: Review of probability theory; basics of statistical estimation; least squares and linear learning; Hilbert spaces; Gaussian random variables; singular-value decomposition; kernel methods, neural networks, and more.

**Lecture notes**: Lecture notes will be handed out as the course progresses.

**Prerequisites / Notice**: Solid basics in linear algebra and probability theory.
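The least-squares topic listed under Content can be illustrated with a short sketch. The synthetic data and linear model below are illustrative assumptions, not course material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model y = X w + noise (illustrative assumption)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)

# Least-squares estimate: w_hat = argmin_w ||y - X w||^2
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print(w_hat)  # close to w_true, since the noise is small
```

With 100 samples and low noise, the estimate recovers the true coefficients to within a few hundredths.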

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 227-0101-00L | Discrete-Time and Statistical Signal Processing | W | 6 credits | 4G | H.‑A. Loeliger |

**Abstract**: The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications: discrete-time linear filters, inverse filters and equalization, DFT, discrete-time stochastic processes, elements of detection theory and estimation theory, LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm.

**Learning objective**: The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications. The two main themes are linearity and probability. In the first part of the course, we deepen our understanding of discrete-time linear filters. In the second part of the course, we review the basics of probability theory and discrete-time stochastic processes. We then discuss some basic concepts of detection theory and estimation theory, as well as some practical methods including LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm. A recurrent theme throughout the course is the stable and robust "inversion" of a linear filter.

**Content**:
1. Discrete-time linear systems and filters: state-space realizations, z-transform and spectrum, decimation and interpolation, digital filter design, stable realizations and robust inversion.
2. The discrete Fourier transform and its use for digital filtering.
3. The statistical perspective: probability, random variables, discrete-time stochastic processes; detection and estimation: MAP, ML, Bayesian MMSE, LMMSE; Wiener filter, LMS adaptive filter, Viterbi algorithm.

**Lecture notes**: Lecture notes.
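The LMS adaptive filter mentioned under Content can be sketched in a few lines: the filter identifies an unknown FIR channel from its input and output. The channel coefficients and step size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unknown FIR channel to identify (illustrative assumption)
h = np.array([0.8, -0.4, 0.2])
N, L = 5000, len(h)
x = rng.standard_normal(N)
d = np.convolve(x, h)[:N]      # desired signal: channel output

# LMS adaptation: w <- w + mu * e[n] * x_window
mu = 0.01
w = np.zeros(L)
for n in range(L, N):
    x_win = x[n:n - L:-1]      # most recent L samples, newest first
    e = d[n] - w @ x_win       # a priori error
    w = w + mu * e * x_win

print(w)  # converges towards h in this noiseless setting
```

In the noiseless case the weight vector converges to the channel taps; with observation noise, the step size `mu` trades convergence speed against steady-state misadjustment.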

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 227-0417-00L | Information Theory I | W | 6 credits | 4G | A. Lapidoth |

**Abstract**: This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.

**Learning objective**: The fundamentals of information theory, including Shannon's source coding and channel coding theorems.

**Content**: The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.

**Literature**: T. M. Cover and J. Thomas, *Elements of Information Theory*, 2nd ed.
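The source coding theorem listed under Content can be illustrated by comparing the entropy of a small source with the expected length of its Huffman code. The distribution below is an illustrative assumption; for a dyadic distribution the Huffman code meets the entropy bound exactly.

```python
import heapq
from math import log2

# Source distribution (illustrative assumption, dyadic on purpose)
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Entropy H(X) = -sum p log2 p : lower bound on expected code length
H = -sum(q * log2(q) for q in p.values())

# Huffman coding: repeatedly merge the two least probable subtrees,
# prepending a bit to every codeword in each merged subtree
heap = [(q, i, {s: ""}) for i, (s, q) in enumerate(p.items())]
heapq.heapify(heap)
counter = len(heap)  # tie-breaker so dicts are never compared
while len(heap) > 1:
    q1, _, c1 = heapq.heappop(heap)
    q2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c1.items()}
    merged.update({s: "1" + code for s, code in c2.items()})
    heapq.heappush(heap, (q1 + q2, counter, merged))
    counter += 1
codes = heap[0][2]

avg_len = sum(p[s] * len(codes[s]) for s in p)
# Source coding theorem: H <= avg_len < H + 1
print(H, avg_len)  # both 1.75 for this dyadic distribution
```

For non-dyadic distributions the Huffman code is still optimal among symbol codes, but its expected length exceeds the entropy by less than one bit, as the theorem guarantees.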
