## Erwin Riegler: Catalogue data in Autumn Semester 2019

Name | Dr. Erwin Riegler

Address | Professur Math. Informationswiss., ETH Zürich, ETF E 120, Sternwartstrasse 7, 8092 Zürich, SWITZERLAND

Telephone | +41 44 633 81 77

E-mail | eriegler@mins.ee.ethz.ch

Department | Information Technology and Electrical Engineering

Relationship | Lecturer

Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
227-0423-00L | Neural Network Theory | 4 credits | 2V + 1U | H. Bölcskei, E. Riegler

Abstract | The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, reproducing kernel Hilbert spaces, support vector machines, fundamental limits of deep neural network learning, dimension measures, and feature extraction with scattering networks.

Objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.

Content |
1. Universal approximation with single- and multi-layer networks
2. Geometry of decision surfaces
3. Separating capacity of nonlinear decision surfaces
4. Generalization
5. Reproducing kernel Hilbert spaces, support vector machines
6. Deep neural network approximation theory: fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, covering numbers, fundamental limits of deep neural network learning
7. Learning of real-valued functions: pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis dimension
8. Scattering networks
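As a minimal illustrative sketch (not part of the course materials) of the constructive idea behind topic 1, a single-hidden-layer ReLU network can realize any piecewise-linear interpolant, and hence approximate any continuous function on an interval arbitrarily well. The function `sin`, the knot count, and all helper names below are illustrative choices, assuming only NumPy:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def build_interpolant(f, a, b, n_knots):
    """Construct (not train) a 1-hidden-layer ReLU net that interpolates
    f at n_knots equispaced points in [a, b]."""
    t = np.linspace(a, b, n_knots)                      # knot locations
    y = f(t)
    slopes = np.diff(y) / np.diff(t)                    # slope on each segment
    # each ReLU unit switches on at a knot and contributes the slope change
    coeffs = np.concatenate([[slopes[0]], np.diff(slopes)])
    biases = t[:-1]
    return biases, coeffs, y[0]

def net(x, biases, coeffs, offset):
    """Evaluate offset + sum_i coeffs[i] * relu(x - biases[i])."""
    return offset + sum(c * relu(x - b) for b, c in zip(biases, coeffs))

biases, coeffs, offset = build_interpolant(np.sin, 0.0, np.pi, 50)
xs = np.linspace(0.0, np.pi, 1000)
err = np.max(np.abs(net(xs, biases, coeffs, offset) - np.sin(xs)))
print(f"max error with {len(biases)} ReLU units: {err:.5f}")
```

Refining the knot grid drives the error to zero, which is the quantitative content of the universal approximation theorems covered in the course.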

Lecture notes | Detailed lecture notes will be provided as we go along.

Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.