Helmut Bölcskei: Catalogue data in Autumn Semester 2022

Award: The Golden Owl
Name: Prof. Dr. Helmut Bölcskei
Field: Mathematical Information Science
Address:
Professur Math. Informationswiss.
ETH Zürich, ETF E 122
Sternwartstrasse 7
8092 Zürich
SWITZERLAND
Telephone: +41 44 632 34 33
E-mail: hboelcskei@ethz.ch
URL: https://www.mins.ee.ethz.ch/people/show/boelcskei
Department: Information Technology and Electrical Engineering
Relationship: Full Professor

Number | Title | ECTS | Hours | Lecturers
227-0045-00L | Signals and Systems I (Restricted registration) | 4 credits | 2V + 2U | H. Bölcskei
Abstract: Signal theory and systems theory (continuous-time and discrete-time): Signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, Discrete Fourier Transform (DFT), finite-dimensional signals and systems, Fast Fourier Transform (FFT).
Learning objective: Introduction to mathematical signal processing and systems theory.
Content: Signal theory and systems theory (continuous-time and discrete-time): Signal analysis in the time and frequency domains, signal spaces, Hilbert spaces, generalized functions, linear time-invariant systems, sampling theorems, discrete-time signals and systems, digital filter structures, Discrete Fourier Transform (DFT), finite-dimensional signals and systems, Fast Fourier Transform (FFT).
Lecture notes: Lecture notes, problem set with solutions.
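
For orientation, the Discrete Fourier Transform (DFT) named in the course content maps a length-N sequence x[0], ..., x[N-1] to its frequency-domain coefficients via the standard definition (recalled here for reference only; not part of the official course description):

\[ X[k] = \sum_{n=0}^{N-1} x[n]\, e^{-j 2\pi k n / N}, \qquad k = 0, 1, \dots, N-1. \]

The Fast Fourier Transform (FFT) evaluates all N coefficients in O(N log N) arithmetic operations rather than the O(N^2) required by direct evaluation.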
227-0423-00L | Neural Network Theory | 4 credits | 2V + 1U | H. Bölcskei
Does not take place this semester.
Abstract: The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: Universal approximation theorems, capacity of separating surfaces, generalization, fundamental limits of deep neural network learning, VC dimension.
Learning objective: After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content:
1. Universal approximation with single- and multi-layer networks

2. Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory

3. Fundamental limits of deep neural network learning

4. Geometry of decision surfaces

5. Separating capacity of nonlinear decision surfaces

6. Vapnik-Chervonenkis (VC) dimension (a standard definition is recalled after this list)

7. VC dimension of neural networks

8. Generalization error in neural network learning
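
For orientation, the Vapnik-Chervonenkis (VC) dimension referred to in items 6 and 7 above admits the following standard definition (recalled here for reference only; not part of the official course description): for a class \(\mathcal{H}\) of binary-valued functions,

\[ \mathrm{VCdim}(\mathcal{H}) = \max \{ m : \text{there exist points } x_1, \dots, x_m \text{ shattered by } \mathcal{H} \}, \]

where x_1, ..., x_m are shattered by \(\mathcal{H}\) if each of the 2^m possible binary labelings of these points is realized by some h in \(\mathcal{H}\); if arbitrarily large shattered sets exist, the VC dimension is infinite.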
Lecture notes: Detailed lecture notes are available on the course web page
https://www.mins.ee.ethz.ch/teaching/nnt/
Prerequisites / Notice: This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, A. Bandeira, H. Bölcskei, S. van de Geer, F. Yang
Abstract: Research colloquium
Learning objective: