Search result: Catalogue data in Autumn Semester 2020
Mathematics Master
Application Area: Only necessary and eligible for credit for the Master's degree in Applied Mathematics. In the category Application Area for the Master's degree in Applied Mathematics, one of the offered application areas must be selected. At least 8 ECTS credits must be obtained in the chosen application area.
Information and Communication Technology
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
227-0105-00L | Introduction to Estimation and Machine Learning | W | 6 credits | 4G | H.‑A. Loeliger
Abstract | Mathematical basics of estimation and machine learning, with a view towards applications in signal processing.
Learning objective | Students master the basic mathematical concepts and algorithms of estimation and machine learning.
Content | Review of probability theory; basics of statistical estimation; least squares and linear learning; Hilbert spaces; Gaussian random variables; singular-value decomposition; kernel methods; neural networks; and more.
Lecture notes | Lecture notes will be handed out as the course progresses.
Prerequisites / Notice | Solid basics in linear algebra and probability theory.
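As an informal sketch of the least-squares and singular-value-decomposition topics listed in the content above (not part of the official course material), the following NumPy example fits a linear model to synthetic data; all data, dimensions, and variable names are illustrative assumptions.

```python
import numpy as np

# Assumed setup: fit a linear model y ≈ X w by ordinary least squares,
# solving via the singular-value decomposition of the design matrix.
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))                 # synthetic design matrix
w_true = np.array([1.0, -2.0, 0.5])         # assumed ground-truth weights
y = X @ w_true + 0.1 * rng.normal(size=n)   # noisy observations

# Least-squares solution w_hat = V diag(1/s) U^T y,
# i.e. the pseudoinverse of X applied to y.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
w_hat = Vt.T @ ((U.T @ y) / s)

print(w_hat)  # close to w_true when the noise is small
```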
227-0101-00L | Discrete-Time and Statistical Signal Processing | W | 6 credits | 4G | H.‑A. Loeliger
Abstract | The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications: discrete-time linear filters, inverse filters and equalization, DFT, discrete-time stochastic processes, elements of detection theory and estimation theory, LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm.
Learning objective | The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications. The two main themes are linearity and probability. In the first part of the course, we deepen our understanding of discrete-time linear filters. In the second part of the course, we review the basics of probability theory and discrete-time stochastic processes. We then discuss some basic concepts of detection theory and estimation theory, as well as some practical methods including LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm. A recurrent theme throughout the course is the stable and robust "inversion" of a linear filter.
Content | 1. Discrete-time linear systems and filters: state-space realizations, z-transform and spectrum, decimation and interpolation, digital filter design, stable realizations and robust inversion. 2. The discrete Fourier transform and its use for digital filtering. 3. The statistical perspective: probability, random variables, discrete-time stochastic processes; detection and estimation: MAP, ML, Bayesian MMSE, LMMSE; Wiener filter, LMS adaptive filter, Viterbi algorithm.
Lecture notes | Lecture notes
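As an informal illustration of the LMS adaptive filter mentioned in the content above (not taken from the lecture notes), the following NumPy sketch identifies the taps of an unknown FIR channel; the channel, step size, and signal lengths are assumptions chosen for demonstration.

```python
import numpy as np

# Assumed setup: an unknown 3-tap FIR channel driven by white noise;
# the LMS algorithm adapts the filter taps w to track the channel output d.
rng = np.random.default_rng(1)
h_true = np.array([0.8, -0.4, 0.2])   # unknown channel taps (assumed)
x = rng.normal(size=2000)             # channel input
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.normal(size=len(x))

mu = 0.05                      # step size (assumed)
w = np.zeros(len(h_true))      # adaptive filter taps
for n in range(len(h_true) - 1, len(x)):
    u = x[n - len(h_true) + 1:n + 1][::-1]   # most recent inputs, newest first
    e = d[n] - w @ u                         # a-priori estimation error
    w = w + mu * e * u                       # LMS update

print(w)  # approaches h_true as the filter adapts
```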
227-0417-00L | Information Theory I | W | 6 credits | 4G | A. Lapidoth
Abstract | This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Learning objective | The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content | The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
Literature | T. M. Cover and J. A. Thomas, Elements of Information Theory (second edition).
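The following Python sketch connects two topics from the content above, the entropy of a discrete memoryless source and the expected length of a binary Huffman code for it; the alphabet and probabilities are illustrative assumptions, and the code is a minimal sketch rather than course material.

```python
import heapq
from math import log2

# Assumed source: four symbols with dyadic probabilities, so the Huffman
# code's average length equals the source entropy exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

entropy = -sum(p * log2(p) for p in probs.values())

# Build a binary Huffman code with a min-heap of (probability, tiebreak, codebook).
heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, code1 = heapq.heappop(heap)   # two least probable groups
    p2, _, code2 = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in code1.items()}
    merged.update({s: "1" + c for s, c in code2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
code = heap[0][2]

avg_len = sum(probs[s] * len(c) for s, c in code.items())
print(f"entropy        = {entropy:.3f} bits/symbol")
print(f"Huffman code   = {code}")
print(f"average length = {avg_len:.3f} bits/symbol")
```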