Name | Prof. Dr. Helmut Bölcskei |
Field | Mathematical Information Science |
Address | Chair of Mathematical Information Science, ETH Zürich, ETF E 122, Sternwartstrasse 7, 8092 Zürich, Switzerland |
Telephone | +41 44 632 34 33 |
E-mail | hboelcskei@ethz.ch |
URL | https://www.mins.ee.ethz.ch/people/show/boelcskei |
Department | Information Technology and Electrical Engineering |
Relationship | Full Professor |
Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
227-0045-00L | Signals and Systems I | 4 credits | 2V + 2U | H. Bölcskei
Abstract | Signal theory and systems theory (continuous-time and discrete-time): signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, the Discrete Fourier Transform (DFT), finite-dimensional signals and systems, and the Fast Fourier Transform (FFT).
Learning objective | Introduction to mathematical signal processing and system theory.
Content | Signal theory and systems theory (continuous-time and discrete-time): signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, the Discrete Fourier Transform (DFT), finite-dimensional signals and systems, and the Fast Fourier Transform (FFT).
Lecture notes | Lecture notes and a problem set with solutions.
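
The abstract above lists the DFT and FFT among the covered topics. As a quick standalone illustration, not taken from the course materials, the following minimal Python sketch (assuming NumPy; the function name `dft` is ad hoc) evaluates the DFT definition X[k] = Σ_n x[n] e^{-2πikn/N} directly and checks it against NumPy's FFT:

```python
# A minimal sketch, assuming NumPy; the function name `dft` is ad hoc.
import numpy as np

def dft(x):
    """Directly evaluate X[k] = sum_n x[n] * exp(-2j*pi*k*n/N); O(N^2) work."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape((N, 1))
    return np.exp(-2j * np.pi * k * n / N) @ x

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
# The FFT computes exactly the same transform, in O(N log N) operations.
assert np.allclose(dft(x), np.fft.fft(x))
```

The point of the check is that the FFT is a fast algorithm for the same linear map, not a different transform.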
227-0423-00L | Neural Network Theory (does not take place this semester) | 4 credits | 2V + 1U | H. Bölcskei
Abstract | The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, fundamental limits of deep neural network learning, VC dimension.
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content |
1. Universal approximation with single- and multi-layer networks
2. Introduction to approximation theory: fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory
3. Fundamental limits of deep neural network learning
4. Geometry of decision surfaces
5. Separating capacity of nonlinear decision surfaces
6. Vapnik-Chervonenkis (VC) dimension
7. VC dimension of neural networks
8. Generalization error in neural network learning
Lecture notes | Detailed lecture notes are available on the course web page: https://www.mins.ee.ethz.ch/teaching/nnt/
Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
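
Item 1 of the content list concerns universal approximation. As a standalone illustration, not code from the course (assuming NumPy; `relu_net` and all parameter names are ad hoc, and the weights are chosen by hand rather than trained), the sketch below builds a single-hidden-layer ReLU network realizing the piecewise-linear interpolant of sin on [0, π]. The classical universal approximation theorems guarantee that such shallow networks approximate any continuous function on a compact set to arbitrary accuracy as the number of hidden units grows:

```python
# A minimal sketch, assuming NumPy; `relu_net` and all parameters are ad hoc,
# not taken from the course. Weights are chosen by hand rather than trained.
import numpy as np

def relu_net(x, knots, f):
    """Single hidden ReLU layer realizing the piecewise-linear interpolant of f:
    g(x) = f(knots[0]) + sum_i c_i * max(x - knots[i], 0),
    where c_i are the slope changes of the interpolant at the knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)   # slope on each interval
    c = np.diff(slopes, prepend=0.0)       # slope changes = hidden-unit weights
    return y[0] + np.maximum(x[:, None] - knots[:-1], 0.0) @ c

knots = np.linspace(0.0, np.pi, 20)        # 19 hidden units
x = np.linspace(0.0, np.pi, 1000)
err = np.max(np.abs(relu_net(x, knots, np.sin) - np.sin(x)))
print(f"max error with 19 hidden units: {err:.5f}")  # decreases as knots grow
```

Since piecewise-linear interpolation of a twice-differentiable function has error O(h²) in the knot spacing h, doubling the number of hidden units roughly quarters the error; this is the constructive flavor behind shallow universal approximation results.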
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, A. Bandeira, H. Bölcskei, S. van de Geer, F. Yang
Abstract | Research colloquium
Learning objective |