Helmut Bölcskei: Catalogue data in Autumn Semester 2019

Award: The Golden Owl
Name: Prof. Dr. Helmut Bölcskei
Field: Mathematical Information Science
Address:
Professur Math. Informationswiss.
ETH Zürich, ETF E 122
Sternwartstrasse 7
8092 Zürich
SWITZERLAND
Telephone: +41 44 632 34 33
E-mail: hboelcskei@ethz.ch
URL: https://www.mins.ee.ethz.ch/people/show/boelcskei
Department: Information Technology and Electrical Engineering
Relationship: Full Professor

Number | Title | ECTS | Hours | Lecturers
227-0045-00L | Signals and Systems I | 4 credits | 2V + 2U | H. Bölcskei
Abstract: Signal theory and systems theory (continuous-time and discrete-time): signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, the Discrete Fourier Transform (DFT), finite-dimensional signals and systems, the Fast Fourier Transform (FFT).
Objective: Introduction to mathematical signal processing and system theory.
Content: Signal theory and systems theory (continuous-time and discrete-time): signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, the Discrete Fourier Transform (DFT), finite-dimensional signals and systems, the Fast Fourier Transform (FFT).
Lecture notes: Lecture notes and a problem set with solutions.
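The relation between the DFT and the FFT covered in this course can be checked with a short numerical sketch (illustrative only, not part of the official course material): a direct O(N²) evaluation of the DFT matrix is compared against NumPy's O(N log N) FFT.

```python
import numpy as np

def dft(x):
    """Direct O(N^2) evaluation of the Discrete Fourier Transform."""
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # DFT matrix
    return W @ x

x = np.random.default_rng(0).standard_normal(8)
X_direct = dft(x)
X_fft = np.fft.fft(x)  # Fast Fourier Transform: same transform, fewer operations
print(np.allclose(X_direct, X_fft))  # the two results agree to machine precision
```

The FFT exploits the factorization of the DFT matrix into sparse stages, which is why it replaces the quadratic-cost matrix product above in practice.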
227-0423-00L | Neural Network Theory | 4 credits | 2V + 1U | H. Bölcskei, E. Riegler
Abstract: The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, reproducing kernel Hilbert spaces, support vector machines, fundamental limits of deep neural network learning, dimension measures, and feature extraction with scattering networks.
Objective: After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content:
1. Universal approximation with single- and multi-layer networks

2. Geometry of decision surfaces

3. Separating capacity of nonlinear decision surfaces

4. Generalization

5. Reproducing kernel Hilbert spaces, support vector machines

6. Deep neural network approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, covering numbers, fundamental limits of deep neural network learning

7. Learning of real-valued functions: Pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis dimension

8. Scattering networks
Lecture notes: Detailed lecture notes will be provided as we go along.
Prerequisites / Notice: This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
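As an illustration of topic 1 above (universal approximation), the following sketch, which is not part of the course material, hand-constructs a single-hidden-layer ReLU network that realizes the piecewise-linear interpolant of a target function; this constructive step underlies many universal approximation arguments. All names here are illustrative.

```python
import numpy as np

def relu_interpolant(f, a, b, m):
    """Single-hidden-layer ReLU network interpolating f at m equispaced knots on [a, b]."""
    t = np.linspace(a, b, m)      # knot positions (hidden-unit biases)
    y = f(t)
    h = t[1] - t[0]
    slopes = np.diff(y) / h       # slope of the interpolant on each sub-interval
    # output weights: first unit carries the initial slope, later units the slope changes
    w = np.concatenate(([slopes[0]], np.diff(slopes)))
    def net(x):
        x = np.asarray(x)
        # hidden layer: ReLU(x - t_k); output: weighted sum plus offset y[0]
        return y[0] + np.maximum(x[..., None] - t[:-1], 0.0) @ w
    return net

f = np.sin
net = relu_interpolant(f, 0.0, np.pi, 64)
x = np.linspace(0.0, np.pi, 1000)
print(np.max(np.abs(net(x) - f(x))))  # uniform error shrinks as m grows
```

Each hidden unit contributes one kink, so m knots suffice for any piecewise-linear target with m pieces; density of piecewise-linear functions in C([a, b]) then yields approximation of arbitrary continuous functions.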
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, A. Bandeira, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.‑A. Loeliger, M. H. Maathuis, G. Rätsch, C. Uhler, S. van de Geer
Abstract: Research colloquium