Search result: Catalogue data in Autumn Semester 2021
Electrical Engineering and Information Technology Master
The core courses and specialisation courses below are a selection for students who wish to specialise in the area of "Signal Processing and Machine Learning", see https://www.ee.ethz.ch/studies/main-master/areas-of-specialisation.html. The individual study plan is subject to the tutor's approval.
These core courses are particularly recommended for the field of "Signal Processing and Machine Learning". You may choose core courses from other fields in agreement with your tutor. A minimum of 24 credits must be obtained from core courses during the MSc EEIT.
Advanced core courses provide in-depth knowledge of the chosen specialisation. They are offered at MSc level only.
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei
Abstract | The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, fundamental limits of deep neural network learning, and VC dimension.
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content | 1. Universal approximation with single- and multi-layer networks (a small numerical sketch follows this entry)
2. Introduction to approximation theory: fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory
3. Fundamental limits of deep neural network learning
4. Geometry of decision surfaces
5. Separating capacity of nonlinear decision surfaces
6. Vapnik-Chervonenkis (VC) dimension
7. VC dimension of neural networks
8. Generalization error in neural network learning
Lecture notes | Detailed lecture notes are available on the course web page: https://www.mins.ee.ethz.ch/teaching/nnt/
Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
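The first content item above can be made concrete with a small numerical experiment. The following is a minimal, illustrative NumPy sketch, not course material: it fits a single-hidden-layer ReLU network to a 1-D target by drawing random hidden weights and solving a linear least-squares problem for the output layer. The network width, the random-feature construction, and the target function are all arbitrary choices for illustration.

```python
# Illustrative sketch (not course material): approximating a 1-D function with a
# single-hidden-layer ReLU network, in the spirit of universal approximation.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)  # sample points
y = np.sin(x)                                       # target function

# Random hidden layer; only the output layer is fitted (by least squares).
n_hidden = 50                                       # arbitrary network width
W = rng.normal(size=(1, n_hidden))                  # hidden weights
b = rng.normal(size=n_hidden)                       # hidden biases
H = np.maximum(x @ W + b, 0.0)                      # ReLU features, shape (200, 50)

a, *_ = np.linalg.lstsq(H, y, rcond=None)           # output-layer weights
print("max approximation error:", np.max(np.abs(H @ a - y)))
```

Increasing `n_hidden` drives the error down; the theorems treated in the course characterize, among other things, how the achievable error scales with network size and the complexity of the signal class.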
227-0427-00L | Signal Analysis, Models, and Machine Learning (does not take place this semester; replaced by "Introduction to Estimation and Machine Learning" and "Advanced Signal Analysis, Modeling, and Machine Learning") | W | 6 credits | 4G | H.‑A. Loeliger
Abstract | Mathematical methods in signal processing and machine learning. I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity. II. Learning linear and nonlinear functions and filters: neural networks, kernel methods. III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.
Learning objective | The course is an introduction to some basic topics in signal processing and machine learning.
Content | Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal-components analysis (a small ridge-regression sketch follows this entry).
Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods.
Part III - Structured Statistical Models and Message Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares, Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.
Lecture notes | Lecture notes are provided.
Prerequisites / Notice | Prerequisites:
- local Bachelor students: the course "Discrete-Time and Statistical Signal Processing" (5th semester)
- others: solid basics in linear algebra and probability theory
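A minimal sketch of one Part I topic, L2-regularized least squares solved via the singular-value decomposition. This is an illustration under my own assumptions (synthetic data, an arbitrary regularization weight), not an excerpt from the lecture notes.

```python
# Illustrative sketch (not course material): ridge regression via the SVD.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(100, 10))                   # measurement matrix
x_true = rng.normal(size=10)                     # unknown parameters
y = A @ x_true + 0.1 * rng.normal(size=100)      # noisy observations

lam = 0.5                                        # regularization weight (arbitrary)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Closed form: x_hat = V diag(s / (s^2 + lam)) U^T y
x_hat = Vt.T @ ((s / (s**2 + lam)) * (U.T @ y))

# Sanity check against the normal equations (A^T A + lam I) x = A^T y.
x_ref = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ y)
print("max deviation between solvers:", np.max(np.abs(x_hat - x_ref)))
```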
227-0447-00L | Image Analysis and Computer Vision | W | 6 credits | 3V + 1U | L. Van Gool, E. Konukoglu, F. Yu
Abstract | Light and perception. Digital image formation. Image enhancement and feature extraction. Unitary transformations. Color and texture. Image segmentation. Motion extraction and tracking. 3D data extraction. Invariant features. Specific object recognition and object class recognition. Deep learning and convolutional neural networks.
Learning objective | Overview of the most important concepts of image formation, perception and analysis, and computer vision. Students gain hands-on experience through practical computer and programming exercises.
Content | This course aims to offer a self-contained account of computer vision and its underlying concepts, including the recent use of deep learning. The first part starts with an overview of existing and emerging applications that need computer vision, showing that the realm of image processing is no longer restricted to the factory floor but is entering several areas of our daily life. The interaction of light with matter is considered first, together with the most important hardware components such as cameras and illumination sources. The course then turns to image discretization, which is necessary to process images by computer. The next part describes the pre-processing steps that enhance image quality and/or detect specific features; linear and non-linear filters are introduced for this purpose (a small filtering sketch follows this entry). The course continues with procedures for extracting additional types of basic information from multiple images, with motion and 3D shape as two important examples. Finally, approaches for the recognition of specific objects as well as object classes are discussed and analyzed. A major part at the end is devoted to deep learning and AI-based approaches to image analysis; the main focus is on object recognition, but other examples of image processing with deep neural nets are also given.
Lecture notes | Course material: script, computer demonstrations, exercises, and problem solutions.
Prerequisites / Notice | Prerequisites: basic concepts of mathematical analysis and linear algebra. The computer exercises are based on Python and Linux. The course language is English.
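As a small taste of the pre-processing steps mentioned in the content, here is an illustrative Python sketch, assuming NumPy and SciPy are available; it is not an official course exercise, and the synthetic test image and filter parameters are my own choices.

```python
# Illustrative sketch (not a course exercise): linear smoothing followed by a
# simple gradient-based edge detector on a synthetic image.
import numpy as np
from scipy import ndimage

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                            # white square on black background

smoothed = ndimage.gaussian_filter(img, sigma=2.0) # linear low-pass filter

# Sobel filters approximate the horizontal/vertical intensity gradients.
gx = ndimage.sobel(smoothed, axis=1)
gy = ndimage.sobel(smoothed, axis=0)
edges = np.hypot(gx, gy)                           # gradient magnitude marks edges

print("strongest edge response:", float(edges.max()))
```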
252-0535-00L | Advanced Machine Learning | W | 10 credits | 3V + 2U + 4A | J. M. Buhmann, C. Cotrini Jimenez
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting, and clustering, with applications in image and speech analysis, bioinformatics, and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective | Students will become familiar with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data.
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply well-known algorithms to real-world data. Topics covered in the lecture include:
- Fundamentals: What is data? Bayesian learning; computational learning theory
- Supervised learning: ensembles (bagging and boosting; see the sketch after this entry); max-margin methods; neural networks
- Unsupervised learning: dimensionality-reduction techniques; clustering; mixture models; non-parametric density estimation; learning dynamical systems
Lecture notes | No lecture notes, but slides will be made available on the course webpage.
Literature | C. Bishop, Pattern Recognition and Machine Learning, Springer, 2007.
R. Duda, P. Hart, and D. Stork, Pattern Classification, John Wiley & Sons, 2nd edition, 2001.
T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer, 2001.
L. Wasserman, All of Statistics: A Concise Course in Statistical Inference, Springer, 2004.
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics, and numerical methods for CSE, as well as practical programming experience for solving the assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on project and exam) to gain credit points.
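To illustrate one of the ensemble methods named in the content, the following sketch runs bootstrap aggregation (bagging) of decision trees with scikit-learn. The dataset is synthetic and all hyperparameters are arbitrary choices of mine, not values prescribed by the course.

```python
# Illustrative sketch (not course material): bagging of decision trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree sees a bootstrap resample of the training set; averaging their
# predictions reduces variance compared with a single deep tree.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
bag.fit(X_train, y_train)
print("test accuracy:", bag.score(X_test, y_test))
```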
263-3210-00L | Deep Learning (number of participants limited to 320) | W | 8 credits | 3V + 2U + 2A | F. Perez Cruz, A. Lucchi
Abstract | Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations.
Learning objective | In recent years, deep learning and deep networks have significantly improved the state of the art in many application domains such as computer vision, speech recognition, and natural language processing. This class will cover the mathematical foundations of deep learning and provide insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology.
Prerequisites / Notice | This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course will make heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or TensorFlow, although that may be a side benefit (a small from-scratch sketch follows this entry).
Participation in the course is subject to the following condition: students must have taken the exam in Advanced Machine Learning (252-0535-00) or have acquired equivalent knowledge; see the exhaustive list below:
- Advanced Machine Learning: https://ml2.inf.ethz.ch/courses/aml/
- Computational Intelligence Lab: http://da.inf.ethz.ch/teaching/2019/CIL/
- Introduction to Machine Learning: https://las.inf.ethz.ch/teaching/introml-S19
- Statistical Learning Theory: http://ml2.inf.ethz.ch/courses/slt/
- Computational Statistics: https://stat.ethz.ch/lectures/ss19/comp-stats.php
- Probabilistic Artificial Intelligence: https://las.inf.ethz.ch/teaching/pai-f18
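Since the course emphasizes the mathematics rather than tool usage, a from-scratch gradient computation is a better illustration here than a Torch or TensorFlow snippet. The sketch below is written under my own assumptions (a tiny synthetic regression task, arbitrary sizes and step size) and is not course material; it spells out the chain rule behind backpropagation for a one-hidden-layer network.

```python
# Illustrative sketch (not course material): backpropagation by hand in NumPy.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(128, 3))                  # inputs
y = np.sin(X.sum(axis=1, keepdims=True))       # regression targets

W1 = 0.5 * rng.normal(size=(3, 16))            # hidden weights
b1 = np.zeros(16)                              # hidden biases
W2 = 0.5 * rng.normal(size=(16, 1))            # output weights
lr = 0.05                                      # step size (arbitrary)

for step in range(500):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)
    y_hat = H @ W2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: the chain rule, written out explicitly.
    g_yhat = 2 * (y_hat - y) / len(X)          # dL/dy_hat
    g_W2 = H.T @ g_yhat
    g_H = g_yhat @ W2.T
    g_pre = g_H * (1 - H**2)                   # tanh'(z) = 1 - tanh(z)^2
    g_W1, g_b1 = X.T @ g_pre, g_pre.sum(axis=0)

    W1 -= lr * g_W1; b1 -= lr * g_b1; W2 -= lr * g_W2

print("final training loss:", float(loss))
```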
401-4944-20L | Mathematics of Data Science | W | 8 credits | 4G | A. Sousa Bandeira
Abstract | A mostly self-contained but fast-paced introductory Master's-level course on various theoretical aspects of algorithms that aim to extract information from data.
Learning objective | Introduction to various mathematical aspects of Data Science.
Content | These topics lie in the overlap of (Applied) Mathematics with Computer Science, Electrical Engineering, Statistics, and/or Operations Research. Each lecture will feature a couple of mathematical open problems related to Data Science. The main mathematical tools used will be Probability and Linear Algebra, and a basic familiarity with these subjects is required. There will also be some Graph Theory, Representation Theory, and Applied Harmonic Analysis, among others, although knowledge of these tools is not assumed. The topics treated will include dimension reduction (see the sketch after this entry), manifold learning, sparse recovery, random matrices, approximation algorithms, community detection in graphs, and several others.
Lecture notes | https://people.math.ethz.ch/~abandeira/BandeiraSingerStrohmer-MDS-draft.pdf
Prerequisites / Notice | The main mathematical tools used will be Probability, Linear Algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary. A. Bandeira and H. Bölcskei
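One of the listed topics, dimension reduction, admits a particularly compact random-projection illustration in the spirit of the Johnson-Lindenstrauss lemma. The sketch below uses synthetic data and arbitrary dimensions of my choosing and is not drawn from the lecture notes.

```python
# Illustrative sketch (not course material): Johnson-Lindenstrauss-style
# random projection, which approximately preserves pairwise distances.
import numpy as np

rng = np.random.default_rng(3)
n, d, k = 200, 10_000, 400                 # n points in R^d, projected to R^k
X = rng.normal(size=(n, d))

# Gaussian projection scaled so that squared norms are preserved in expectation.
P = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ P

i, j = 0, 1                                # compare one pairwise distance
print("distance before:", float(np.linalg.norm(X[i] - X[j])))
print("distance after: ", float(np.linalg.norm(Y[i] - Y[j])))
```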