Christoph Schwab: Catalogue data in Autumn Semester 2018

Name Prof. Dr. Christoph Schwab
Field Mathematics
Address
Seminar für Angewandte Mathematik
ETH Zürich, HG G 57.1
Rämistrasse 101
8092 Zürich
SWITZERLAND
Telephone +41 44 632 35 95
Fax +41 44 632 10 85
E-mail christoph.schwab@sam.math.ethz.ch
URL http://www.sam.math.ethz.ch/~schwab
Department Mathematics
Relationship Full Professor

Number  Title  ECTS  Hours  Lecturers
401-3650-68L  Numerical Analysis Seminar: Mathematics of Deep Neural Network Approximation (Restricted registration)
Number of participants limited to 6.
4 credits  2S  C. Schwab
Abstract This seminar will review recent (2016-) _mathematical results_ on the approximation power of deep neural networks (DNNs). The focus will be on mathematical proof techniques for obtaining approximation rate estimates (in terms of neural network size and connectivity) for various classes of input data.
Objective
Content Presentation of the seminar:
Deep Neural Networks (DNNs) have recently attracted substantial
attention by outperforming the best established techniques
in a number of application areas (chess, Go, autonomous
driving, language translation, image classification, etc.).
In many cases, these successes have been achieved by
implementations, based on heuristics, with massive compute power
and training data.

This seminar will review recent (2016-) _mathematical results_
on the approximation power of deep neural networks (DNNs).
The focus will be on mathematical proof techniques for
obtaining approximation rate estimates (in terms of neural
network size and connectivity) for various classes of input data.
Here, too, there is mounting mathematical evidence that DNNs
match or outperform the best known approximation rates.

Particular cases comprise:
high-dimensional parametric maps,
analytic and holomorphic maps,
maps containing multi-scale features which arise as solution classes from PDEs,
classes of maps which are invariant under group actions.

The format will be oral student presentations in December 2018
based on a recent research paper selected in two meetings
at the start of the semester.
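As a small illustration of the kind of approximation rate result the seminar treats, the sketch below implements the sawtooth construction from Yarotsky's paper on the reading list (arXiv:1610.01145, Error bounds for approximations with deep ReLU networks): composing a ReLU-realizable hat function m times yields the piecewise-linear interpolant of x^2 on a uniform grid with 2^m intervals, so the sup-norm error decays like 4^(-(m+1)) while the network grows only linearly in m. The function names below are ours, chosen for illustration:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # ReLU realization of the hat function g(x) = min(2x, 2 - 2x) on [0, 1]
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def square_approx(x, m):
    """Yarotsky's m-layer ReLU approximation of x**2 on [0, 1]:
    x**2 ~ x - sum_{s=1}^m g^{(s)}(x) / 4**s,
    which equals the piecewise-linear interpolant of x**2
    on a uniform grid with 2**m intervals."""
    x = np.asarray(x, dtype=float)
    out = x.copy()
    g = x.copy()
    for s in range(1, m + 1):
        g = hat(g)            # s-fold composition: sawtooth with 2**(s-1) teeth
        out = out - g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 1001)
for m in (2, 4, 6):
    err = np.max(np.abs(x**2 - square_approx(x, m)))
    print(f"m={m}: sup error {err:.2e}  (bound 4**-(m+1) = {4.0**-(m+1):.2e})")
```

The error halves twice per added layer of composition, which is the exponential-in-depth accuracy gain that several of the papers below quantify and generalize.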
Literature Partial reading list:

arXiv:1809.07669
DNN Expression Rate Analysis of High-dimensional PDEs: Application to
Option Pricing
Authors: Dennis Elbrächter, Philipp Grohs, Arnulf Jentzen, Christoph Schwab

arXiv:1806.08459
Topological properties of the set of functions generated by neural networks of fixed size
Authors: Philipp Petersen, Mones Raslan, Felix Voigtlaender

arXiv:1804.10306
Universal approximations of invariant maps by neural networks
Author: Dmitry Yarotsky

arXiv:1802.03620
Optimal approximation of continuous functions by very deep ReLU networks
Author: Dmitry Yarotsky

arXiv:1709.05289
Optimal approximation of piecewise smooth functions using deep ReLU neural networks
Authors: Philipp Petersen, Felix Voigtlaender

arXiv:1706.03301
Neural networks and rational functions
Author: Matus Telgarsky

arXiv:1705.05502
The power of deeper networks for expressing natural functions
Authors: David Rolnick, Max Tegmark

arXiv:1705.01365
Quantified advantage of discontinuous weight selection in approximations with deep neural networks
Author: Dmitry Yarotsky

arXiv:1610.01145
Error bounds for approximations with deep ReLU networks
Author: Dmitry Yarotsky

arXiv:1608.03287
Deep vs. shallow networks: An approximation theory perspective
Authors: Hrushikesh Mhaskar, Tomaso Poggio

arXiv:1602.04485
Benefits of depth in neural networks
Author: Matus Telgarsky
Prerequisites / Notice Each seminar topic allows expansion into a semester thesis or a
master thesis in the MSc MATH or MSc Applied MATH.

Disclaimer:
The seminar will _not_ address recent developments in DNN
software, such as training heuristics, or programming techniques
for various specific applications.
401-5650-00L  Zurich Colloquium in Applied and Computational Mathematics  0 credits  2K  R. Abgrall, R. Alaifari, H. Ammari, R. Hiptmair, A. Jentzen, C. Jerez Hanckes, S. Mishra, S. Sauter, C. Schwab
Abstract Research colloquium
Objective