Search result: Catalogue data in Autumn Semester 2019
Electrical Engineering and Information Technology Master

Master's Programme (Programme Regulations 2008)

Specialisation Courses: A total of 42 credits must be earned from specialisation courses during the Master's programme. The individual study plan is subject to the approval of a tutor.

Signal Processing and Machine Learning

Core Subjects

Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei, E. Riegler
Abstract | The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, reproducing kernel Hilbert spaces, support vector machines, fundamental limits of deep neural network learning, dimension measures, and feature extraction with scattering networks.
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content | 1. Universal approximation with single- and multi-layer networks (a numerical sketch follows after this table). 2. Geometry of decision surfaces. 3. Separating capacity of nonlinear decision surfaces. 4. Generalization. 5. Reproducing kernel Hilbert spaces, support vector machines. 6. Deep neural network approximation theory: fundamental limits on the compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, covering numbers, fundamental limits of deep neural network learning. 7. Learning of real-valued functions: pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis dimension. 8. Scattering networks.
Lecture notes | Detailed lecture notes will be provided as we go along.
Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
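
As a hedged illustration of the universal-approximation topic above, the following minimal numpy sketch (our own example, not course material) fits a single-hidden-layer ReLU network to a smooth target function; the hidden weights are drawn at random and only the output layer is fit by least squares. The width, sample count, seed, and weight scales are illustrative assumptions.

```python
# Minimal numpy sketch (not course material): a single-hidden-layer ReLU
# network with randomly drawn hidden weights, output layer fit by least
# squares, approximating f(x) = sin(2*pi*x) on [0, 1]. All parameter
# choices (width, sample count, weight scales) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_hidden, n_train = 200, 1000

x = rng.uniform(0.0, 1.0, size=n_train)        # training inputs
y = np.sin(2.0 * np.pi * x)                    # target function values

w = rng.normal(0.0, 10.0, size=n_hidden)       # random hidden weights
b = rng.uniform(-10.0, 10.0, size=n_hidden)    # random hidden biases
phi = np.maximum(w * x[:, None] + b, 0.0)      # hidden features, (n_train, n_hidden)

# Least-squares fit of the output weights: min_a || phi a - y ||_2
a, *_ = np.linalg.lstsq(phi, y, rcond=None)

x_test = np.linspace(0.0, 1.0, 500)
y_hat = np.maximum(w * x_test[:, None] + b, 0.0) @ a
print("max abs error:", np.abs(y_hat - np.sin(2.0 * np.pi * x_test)).max())
```

Increasing the width typically drives the approximation error down, which is the qualitative content of the universal approximation theorems treated rigorously in the lecture.
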
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
227-0427-00L | Signal Analysis, Models, and Machine Learning | W | 6 credits | 4G | H.-A. Loeliger
Abstract | Mathematical methods in signal processing and machine learning. I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity. II. Learning linear and nonlinear functions and filters: neural networks, kernel methods. III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.
Learning objective | The course is an introduction to some basic topics in signal processing and machine learning.
Content | Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal components analysis. Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods. Part III - Structured Statistical Models and Message Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares (a scalar Kalman-filter sketch follows after this table), Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.
Lecture notes | Lecture notes.
Prerequisites / Notice | Prerequisites: for local Bachelor students, the course "Discrete-Time and Statistical Signal Processing" (5th semester); for all others, solid basics in linear algebra and probability theory.
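
To make the Kalman-filter topic from Part III concrete, here is a minimal numpy sketch of our own (not course material): a scalar Kalman filter tracking a random-walk state from noisy observations. The noise variances, horizon, and seed are illustrative assumptions.

```python
# Minimal numpy sketch (not course material): scalar Kalman filter for
# the model x[k] = x[k-1] + process noise, y[k] = x[k] + measurement
# noise. Variances q and r are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, q, r = 200, 0.01, 1.0                 # steps, process var, measurement var

x = np.cumsum(rng.normal(0.0, np.sqrt(q), size=n))   # true random-walk state
y = x + rng.normal(0.0, np.sqrt(r), size=n)          # noisy observations

m, p = 0.0, 1.0                          # initial state estimate and variance
est = np.empty(n)
for k in range(n):
    p = p + q                            # predict: variance grows by q
    kgain = p / (p + r)                  # Kalman gain
    m = m + kgain * (y[k] - m)           # update estimate with innovation
    p = (1.0 - kgain) * p                # update variance
    est[k] = m

print("RMSE raw observations:", np.sqrt(np.mean((y - x) ** 2)))
print("RMSE Kalman estimates:", np.sqrt(np.mean((est - x) ** 2)))
```

The filtered estimates should show a lower RMSE than the raw observations, which is the variance-reduction effect the lecture derives via Gaussian message passing.
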
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
227-0447-00L | Image Analysis and Computer Vision | W | 6 credits | 3V + 1U | L. Van Gool, O. Göksel, E. Konukoglu
Abstract | Light and perception. Digital image formation. Image enhancement and feature extraction. Unitary transformations. Color and texture. Image segmentation. Motion extraction and tracking. 3D data extraction. Invariant features. Specific object recognition and object class recognition. Deep learning and convolutional neural networks.
Learning objective | Overview of the most important concepts of image formation, perception and analysis, and computer vision, with hands-on experience gained through practical computer and programming exercises.
Content | This course aims at offering a self-contained account of computer vision and its underlying concepts, including the recent use of deep learning. The first part starts with an overview of existing and emerging applications that need computer vision, showing that image processing is no longer restricted to the factory floor but is entering many areas of daily life. First, the interaction of light with matter is considered, and the most important hardware components, such as cameras and illumination sources, are discussed. The course then turns to image discretization, which is necessary to process images by computer. The next part describes the pre-processing steps that enhance image quality and/or detect specific features; linear and nonlinear filters are introduced for that purpose (a small convolution sketch follows after this table). The course continues with procedures for extracting additional types of basic information from multiple images, with motion and 3D shape as two important examples. Finally, approaches for the recognition of specific objects as well as object classes are discussed and analyzed. A major part at the end is devoted to deep learning and AI-based approaches to image analysis; its main focus is on object recognition, but other examples of image processing with deep neural nets are given as well.
Lecture notes | Course material: script, computer demonstrations, exercises, and problem solutions.
Prerequisites / Notice | Prerequisites: basic concepts of mathematical analysis and linear algebra. The computer exercises are based on Python and Linux. The course language is English.
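
Since the course exercises use Python, here is a minimal numpy sketch of our own (not course material) of the linear-filtering step mentioned above: 2D convolution of a grayscale image with a Sobel kernel, a classic pre-processing filter for detecting vertical edges. The helper function `conv2d` and the synthetic test image are our illustrative assumptions.

```python
# Minimal numpy sketch (not course material): linear filtering by 2D
# convolution with a Sobel kernel, applied to a synthetic step-edge image.
import numpy as np

def conv2d(img, kernel):
    """'Valid' 2D convolution of a 2D image with a 2D kernel."""
    kh, kw = kernel.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    flipped = kernel[::-1, ::-1]           # convolution flips the kernel
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * flipped)
    return out

sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

img = np.zeros((8, 8))
img[:, 4:] = 1.0                           # vertical step edge at column 4
edges = conv2d(img, sobel_x)
print(np.abs(edges).max(axis=0))           # strongest response at the edge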
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
252-0535-00L | Advanced Machine Learning | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting, and clustering, with applications in image and speech analysis, bioinformatics, and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the algorithms on real-world data.
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, in which they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: Fundamentals: what is data?, Bayesian learning, computational learning theory. Supervised learning: ensembles, i.e. bagging and boosting (a small bagging sketch follows after this table), max-margin methods, neural networks. Unsupervised learning: dimensionality reduction techniques, clustering, mixture models, non-parametric density estimation, learning dynamical systems.
Lecture notes | No lecture notes, but slides will be made available on the course webpage.
Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics, and numerical methods for CSE, as well as practical programming experience for solving the assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on the project and exam) to gain credit points.
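
As a hedged illustration of the bagging topic above, the following minimal numpy sketch (our own example, not course material) applies bootstrap aggregation to a deliberately high-variance base learner, a 1-nearest-neighbor regressor implemented here as the hypothetical helper `predict_1nn`; the data set, noise level, bag count, and seed are illustrative assumptions.

```python
# Minimal numpy sketch (not course material): bagging, i.e. bootstrap
# aggregation, of a high-variance base learner. The base learner is a
# 1-nearest-neighbor regressor; averaging its predictions over bootstrap
# resamples smooths the fit on noisy data.
import numpy as np

rng = np.random.default_rng(2)
n, n_bags = 100, 50

x = np.sort(rng.uniform(0.0, 1.0, size=n))
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.3, size=n)   # noisy targets

def predict_1nn(x_train, y_train, x_query):
    """1-nearest-neighbor regression: copy the label of the closest point."""
    idx = np.abs(x_query[:, None] - x_train[None, :]).argmin(axis=1)
    return y_train[idx]

x_test = np.linspace(0.0, 1.0, 200)
preds = np.zeros_like(x_test)
for _ in range(n_bags):
    boot = rng.integers(0, n, size=n)            # bootstrap resample indices
    preds += predict_1nn(x[boot], y[boot], x_test)
preds /= n_bags                                  # aggregate by averaging

single = predict_1nn(x, y, x_test)               # one un-bagged learner
truth = np.sin(2.0 * np.pi * x_test)
print("RMSE single 1-NN:", np.sqrt(np.mean((single - truth) ** 2)))
print("RMSE bagged 1-NN:", np.sqrt(np.mean((preds - truth) ** 2)))
```

On this kind of noisy regression data the bagged ensemble typically attains a lower RMSE than a single learner, which is the variance-reduction argument the lecture develops for ensembles.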