# Search result: Catalogue data in Autumn Semester 2020

Statistics Master. The following courses belong to the curriculum of the Master's Programme in Statistics. The corresponding credits do not count as external credits even for course units where an enrolment at ETH Zurich is not possible. | ||||||

Master Studies (Programme Regulations 2014) | ||||||

Specialization Areas and Electives | ||||||

Statistical and Mathematical Courses | ||||||

Number | Title | Type | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|---|

401-3601-00L | Probability Theory. At most one of the three course units (Bachelor Core Courses) 401-3461-00L Functional Analysis I, 401-3531-00L Differential Geometry I and 401-3601-00L Probability Theory can be recognised for the Master's degree in Mathematics or Applied Mathematics. In this case, you cannot change the category assignment yourself in myStudies but must contact the Study Administration Office (www.math.ethz.ch/studiensekretariat) after having received the credits. | W | 10 credits | 4V + 1U | A.‑S. Sznitman | |

Abstract | Basics of probability theory and the theory of stochastic processes in discrete time | |||||

Objective | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: Basics in measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, Galton Watson chain, transition probability, Theorem of Ionescu Tulcea, Markov chains. | |||||

Content | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: Basics in measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, Galton Watson chain, transition probability, Theorem of Ionescu Tulcea, Markov chains. | |||||
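As a side illustration (our own sketch, not part of the course materials), two of the results listed above, the law of large numbers and the central limit theorem, can be seen empirically in a few lines of Python:

```python
import random
import statistics

random.seed(42)

def sample_mean(n):
    """Mean of n fair coin flips, i.e. of n i.i.d. Bernoulli(1/2) draws."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

# Law of large numbers: the sample mean approaches p = 1/2 as n grows.
print(sample_mean(100), sample_mean(10_000))

# Central limit theorem: sqrt(n) * (mean - p) is approximately normal
# with standard deviation sqrt(p * (1 - p)) = 0.5.
z = [(sample_mean(400) - 0.5) * 400 ** 0.5 for _ in range(1_000)]
print(round(statistics.stdev(z), 2))  # close to 0.5
```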

Lecture notes | available in electronic form. | |||||

Literature | R. Durrett, Probability: Theory and examples, Duxbury Press 1996 H. Bauer, Probability Theory, de Gruyter 1996 J. Jacod and P. Protter, Probability essentials, Springer 2004 A. Klenke, Wahrscheinlichkeitstheorie, Springer 2006 D. Williams, Probability with martingales, Cambridge University Press 1991 | |||||

401-3627-00L | High-Dimensional Statistics. Does not take place this semester. | W | 4 credits | 2V | P. L. Bühlmann | |

Abstract | "High-Dimensional Statistics" deals with modern methods and theory for statistical inference when the number of unknown parameters is of much larger order than sample size. Statistical estimation and algorithms for complex models and aspects of multiple testing will be discussed. | |||||

Objective | Knowledge of methods and basic theory for high-dimensional statistical inference | |||||

Content | Lasso and Group Lasso for high-dimensional linear and generalized linear models; Additive models and many smooth univariate functions; Non-convex loss functions and l1-regularization; Stability selection, multiple testing and construction of p-values; Undirected graphical modeling | |||||
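To make the first item above concrete, here is a minimal sketch (our own illustration, not course material; the toy data and names are assumptions) of the Lasso fitted by coordinate descent with the soft-thresholding operator, in plain Python:

```python
import random

def soft_threshold(rho, lam):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)*||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, lam) / z
    return b

# Toy data: y depends on the first feature only; the second is pure noise.
random.seed(1)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
y = [2.0 * x1 + random.gauss(0, 0.1) for x1, _ in X]
b = lasso_cd(X, y, lam=0.5)
print(b)  # first coefficient shrunk below its true value 2, second exactly 0
```

The exact zero in the second coordinate is the point of the l1 penalty: the Lasso performs variable selection, which is what makes it usable when p is much larger than n.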

Literature | Peter Bühlmann and Sara van de Geer (2011). Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Verlag. ISBN 978-3-642-20191-2. | |||||

Prerequisites / Notice | Knowledge of basic concepts in probability theory, and intermediate knowledge of statistics (e.g. a course in linear models or computational statistics). | |||||

401-3612-00L | Stochastic Simulation | W | 5 credits | 3G | F. Sigrist | |

Abstract | This course introduces statistical Monte Carlo methods. This includes applications of stochastic simulation in various fields (statistics, statistical mechanics, operations research, financial mathematics), generating uniform and arbitrary random variables (incl. rejection and importance sampling), the accuracy of methods, variance reduction, quasi-Monte Carlo, and Markov chain Monte Carlo. | |||||

Objective | Students know the stochastic simulation methods introduced in this course. Students understand and can explain these methods, show how they are related to each other, know their weaknesses and strengths, apply them in practice, and prove key results. | |||||

Content | Examples of simulations in different fields (statistics, statistical mechanics, operations research, financial mathematics). Generation of uniform random variables. Generation of random variables with arbitrary distributions (including rejection sampling and importance sampling), simulation of multivariate normal variables and stochastic differential equations. The accuracy of Monte Carlo methods. Methods for variance reduction and quasi-Monte Carlo. Introduction to Markov chains and Markov chain Monte Carlo (Metropolis-Hastings, Gibbs sampler, Hamiltonian Monte Carlo, reversible jump MCMC). Algorithms introduced in the course are illustrated with the statistical software R. | |||||
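The rejection-sampling idea listed above can be sketched in a few lines; this toy example is our own illustration in Python (the course itself uses R) and draws from a Beta(2,2) target with a uniform proposal:

```python
import random

random.seed(0)

def rejection_sample_beta22():
    """Draw from Beta(2,2), density f(x) = 6x(1-x) on [0,1], using a
    Uniform(0,1) proposal g and the envelope f(x) <= M*g(x) with M = 1.5."""
    M = 1.5
    while True:
        x = random.random()              # proposal draw from g
        u = random.random()              # acceptance test
        if u * M <= 6 * x * (1 - x):     # accept with prob f(x) / (M*g(x))
            return x

draws = [rejection_sample_beta22() for _ in range(20_000)]
print(sum(draws) / len(draws))  # mean of Beta(2,2) is 0.5
```

The expected acceptance rate is 1/M = 2/3; choosing a proposal that makes M small is exactly the efficiency question the course addresses.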

Lecture notes | A script will be available in English. | |||||

Literature | P. Glasserman, Monte Carlo Methods in Financial Engineering. Springer 2004. B. D. Ripley. Stochastic Simulation. Wiley, 1987. Ch. Robert, G. Casella. Monte Carlo Statistical Methods. Springer 2004 (2nd edition). | |||||

Prerequisites / Notice | It is assumed that students have had an introduction to probability theory and statistics (random variables, joint and conditional distributions, law of large numbers, central limit theorem, basics of measure theory). The course resources (including script, slides, exercises) will be provided via the Moodle online learning platform. | |||||

401-4619-67L | Advanced Topics in Computational Statistics. Does not take place this semester. | W | 4 credits | 2V | not available | |

Abstract | This lecture covers selected advanced topics in computational statistics. This year the focus will be on graphical modelling. | |||||

Objective | Students learn the theoretical foundations of the selected methods, as well as practical skills to apply these methods and to interpret their outcomes. | |||||

Content | The main focus will be on graphical models in various forms: Markov properties of undirected graphs; Belief propagation; Hidden Markov Models; Structure estimation and parameter estimation; inference for high-dimensional data; causal graphical models | |||||

Prerequisites / Notice | We assume a solid background in mathematics, an introductory lecture in probability and statistics, and at least one more advanced course in statistics. | |||||

401-4633-00L | Data Analytics in Organisations and Business | W | 5 credits | 2V + 1U | I. Flückiger | |

Abstract | On the end-to-end process of data analytics in organisations and business, and how to transform data into insights for fact-based decisions. Presentation of the process from the beginning, with framing the business problem, to presenting the results and making decisions by the use of data analytics. For each topic, case studies from the financial services, healthcare and retail sectors will be presented. | |||||

Objective | The goal of this course is to give students an understanding of the data analytics process in the business world, with a special focus on the skills and techniques used besides the technical ones. Students will become familiar with the "business language", current problems and thinking in organisations and business, and the tools used. | |||||

Content | Framing the Business Problem; Framing the Analytics Problem; Data; Methodology; Model Building; Deployment; Model Lifecycle; Soft Skills for the Statistical/Mathematical Professional | |||||

Lecture notes | Lecture Notes will be available. | |||||

Prerequisites / Notice | Prerequisites: Basic statistics and probability theory and regression | |||||

401-6217-00L | Using R for Data Analysis and Graphics (Part II) | W | 1.5 credits | 1G | M. Mächler | |

Abstract | The course provides the second part of an introduction to the statistical software R for scientists. Topics are data generation and selection, graphical functions, important statistical functions, types of objects, models, programming and writing functions. Note: this part builds on "Using R... (Part I)", but can be taken independently if the basics of R are already known. | |||||

Objective | The students will be able to use the software R efficiently for data analysis, graphics and simple programming | |||||

Content | The course provides the second part of an introduction to the statistical software R (https://www.r-project.org/) for scientists. R is free software that contains a huge collection of functions with a focus on statistics and graphics. To use R, one has to learn the programming language R, at least on a rudimentary level. The course aims to facilitate this by providing a basic introduction to R. Part II of the course builds on part I and covers the following additional topics: - Elements of the R language: control structures (if, else, loops), lists, overview of R objects, attributes of R objects; - More on R functions; - Applying functions to elements of vectors, matrices and lists; - Object-oriented programming with R: classes and methods; - Tailoring R: options; - Extending basic R: packages. The course focuses on practical work at the computer. We will make use of the graphical user interface RStudio: www.rstudio.org | |||||

Lecture notes | An Introduction to R. http://stat.ethz.ch/CRAN/doc/contrib/Lam-IntroductionToR_LHL.pdf | |||||

Prerequisites / Notice | Basic knowledge of R equivalent to "Using R... (Part I)" (= 401-6215-00L) is a prerequisite for this course. The course resources will be provided via the Moodle web learning platform. Subscribing via myStudies should *automatically* make you a student participant of the Moodle course of this lecture, which is at https://moodle-app2.let.ethz.ch/course/view.php?id=13500 All material is available on this Moodle page. | |||||

401-0627-00L | Smoothing and Nonparametric Regression with Examples | W | 4 credits | 2G | S. Beran-Ghosh | |

Abstract | Starting with an overview of selected results from parametric inference, kernel smoothing will be introduced along with some asymptotic theory, optimal bandwidth selection, data driven algorithms and some special topics. Examples from environmental research will be used for motivation, but the methods will also be applicable elsewhere. | |||||

Objective | The students will learn about methods of kernel smoothing and application of concepts to data. The aim will be to build sufficient interest in the topic and intuition as well as the ability to implement the methods to various different datasets. | |||||

Content | Rough outline: - Parametric estimation methods (selection of important results): maximum likelihood; method of least squares: regression and diagnostics. - Nonparametric curve estimation: density estimation, kernel regression, local polynomials, bandwidth selection; a selection of special topics (as time permits, we will cover as many as possible) such as rapid change points, mode estimation, robust smoothing, partial linear models, etc. - Applications: potential areas of application will be discussed, such as change assessment, trend and surface estimation, probability and quantile curve estimation, and others. | |||||
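As a small illustration of the kernel regression topic above (our own sketch in Python, not course material), the Nadaraya-Watson estimator is just a locally weighted average:

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression estimate at x0 with a Gaussian
    kernel and bandwidth h: a locally weighted average of the ys."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# Noise-free example: smoothing x^2 on a grid around its minimum.
xs = [i / 10 for i in range(-20, 21)]
ys = [x ** 2 for x in xs]
est = nadaraya_watson(0.0, xs, ys, h=0.2)
print(round(est, 3))  # about 0.04: the smoother is biased upward at the minimum
```

Even without noise the estimate at 0 is not 0: averaging a convex curve over a window introduces bias, which is exactly the bias-variance trade-off that bandwidth selection has to balance.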

Lecture notes | Brief summaries or outlines of some of the lecture material will be posted at https://www.wsl.ch/en/employees/ghosh.html. NOTE: The posted notes will tend to be just sketches whereas only the in-class lessons will contain complete information. LOG IN: In order to have access to the posted notes, you will need the course user id & the password. These will be given out on the first day of the lectures. | |||||

Literature | References: - Statistical Inference, by S.D. Silvey, Chapman & Hall. - Regression Analysis: Theory, Methods and Applications, by A. Sen and M. Srivastava, Springer. - Density Estimation, by B.W. Silverman, Chapman and Hall. - Nonparametric Simple Regression, by J. Fox, Sage Publications. - Applied Smoothing Techniques for Data Analysis: the Kernel Approach With S-Plus Illustrations, by A.W. Bowman, A. Azzalini, Oxford University Press. - Kernel Smoothing: Principles, Methods and Applications, by S. Ghosh, Wiley. Additional references will be given out in the lectures. | |||||

Prerequisites / Notice | Prerequisites: A background in Linear Algebra, Calculus, Probability & Statistical Inference including Estimation and Testing. | |||||

447-6221-00L | Nonparametric Regression. Special students "University of Zurich (UZH)" in the Master Program in Biostatistics at UZH cannot register for this course unit electronically. Forward the lecturer's written permission to attend to the Registrar's Office. Alternatively, the lecturer may also send an email directly to registrar@ethz.ch. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | M. Mächler | |

Abstract | This course focusses on nonparametric estimation of probability densities and regression functions. These methods allow modelling without restrictive assumptions such as linearity. The smoothing methods require a weight function and a smoothing parameter. The focus is on one dimension; higher dimensions and samples of curves are treated briefly. Exercises at the computer. | |||||

Objective | Knowledge on estimation of probability densities and regression functions via various statistical methods. Understanding of the choice of weight function and of the smoothing parameter, also done automatically. Practical application on data sets at the computer. | |||||

447-6233-00L | Spatial Statistics. Special students "University of Zurich (UZH)" in the Master Program in Biostatistics at UZH cannot register for this course unit electronically. Forward the lecturer's written permission to attend to the Registrar's Office. Alternatively, the lecturer may also send an email directly to registrar@ethz.ch. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | A. J. Papritz | |

Abstract | In many research fields, spatially referenced data are collected. When analysing such data the focus is either on exploring their structure (dependence on explanatory variables, autocorrelation) and/or on spatial prediction. The course provides an introduction to geostatistical methods that are useful for such purposes. | |||||

Objective | The course will provide an overview of the basic concepts and stochastic models that are commonly used to model spatial data. In addition, the participants will learn a number of geostatistical techniques and acquire some familiarity with software that is useful for analysing spatial data. | |||||

Content | After an introductory discussion of the types of problems and the kind of data that arise in environmental research, an introduction into linear geostatistics (models: stationary and intrinsic random processes, modelling large-scale spatial patterns by regression, modelling autocorrelation by variogram; kriging: mean-square prediction of spatial data) will be taught. The lectures will be complemented by data analyses that the participants have to do themselves. | |||||
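The variogram modelling mentioned above can be illustrated with a minimal empirical semivariogram (our own sketch in Python, not course material; the grid data are an assumption):

```python
import math

def empirical_variogram(points, values, bins):
    """Empirical semivariogram: for each distance bin, half the average
    squared difference of values over all point pairs in that bin."""
    sums = [0.0] * len(bins)
    counts = [0] * len(bins)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            for k, (lo, hi) in enumerate(bins):
                if lo <= d < hi:
                    sums[k] += 0.5 * (values[i] - values[j]) ** 2
                    counts[k] += 1
                    break
    return [s / c if c else float('nan') for s, c in zip(sums, counts)]

# A smooth spatial trend on a 5x5 grid: nearby points have similar values,
# so the semivariogram increases with distance.
pts = [(x, y) for x in range(5) for y in range(5)]
vals = [x + y for x, y in pts]
gamma = empirical_variogram(pts, vals, bins=[(0, 1.5), (1.5, 3.0), (3.0, 6.0)])
print(gamma)  # increases with distance
```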

Lecture notes | Slides, descriptions of the problems for the data analyses and worked-out solutions to them will be provided. | |||||

Literature | P.J. Diggle & P.J. Ribeiro Jr. 2007. Model-based Geostatistics. Springer | |||||

447-6245-00L | Data Mining. Special students "University of Zurich (UZH)" in the Master Program in Biostatistics at UZH cannot register for this course unit electronically. Forward the lecturer's written permission to attend to the Registrar's Office. Alternatively, the lecturer may also send an email directly to registrar@ethz.ch. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | M. Mächler | |

Abstract | Block course only on prediction problems, aka "supervised learning". Part 1, Classification: logistic regression, linear/quadratic discriminant analysis, Bayes classifier; additive and tree models; further flexible ("nonparametric") methods. Part 2, Flexible Prediction: additive models, MARS, Y-Transformation models (ACE,AVAS); Projection Pursuit Regression (PPR), neural nets. | |||||

Objective | ||||||

Content | "Data Mining" is a large field, of which this block course only treats so-called prediction problems, aka "supervised learning". Part 1, Classification, recalls logistic regression and linear/quadratic discriminant analysis (LDA/QDA) and extends these (in the framework of the "Bayes classifier") to (generalized) additive (GAM) and tree models (CART), and further mentions other flexible ("nonparametric") methods. Part 2, Flexible Prediction (of a continuous or "class" response/target), contains additive models, MARS, Y-transformation models (ACE, AVAS), projection pursuit regression (PPR) and neural nets. | |||||
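As a toy illustration of the classification part (our own sketch in Python; the course itself works in R), logistic regression fitted by batch gradient descent on two separable clusters:

```python
import math
import random

def train_logistic(data, labels, lr=0.5, epochs=500):
    """Logistic regression via batch gradient descent on the log-loss.
    Returns weights [w1, w2, bias]."""
    w = [0.0, 0.0, 0.0]
    n = len(data)
    for _ in range(epochs):
        grad = [0.0, 0.0, 0.0]
        for (x1, x2), y in zip(data, labels):
            p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + w[2])))
            for k, xk in enumerate((x1, x2, 1.0)):
                grad[k] += (p - y) * xk / n   # gradient of the log-loss
        w = [wk - lr * gk for wk, gk in zip(w, grad)]
    return w

# Two well-separated Gaussian clusters.
random.seed(3)
data = ([(random.gauss(-2, 1), random.gauss(-2, 1)) for _ in range(50)] +
        [(random.gauss(2, 1), random.gauss(2, 1)) for _ in range(50)])
labels = [0] * 50 + [1] * 50
w = train_logistic(data, labels)
pred = [1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0 for x1, x2 in data]
acc = sum(p == y for p, y in zip(pred, labels)) / len(labels)
print(acc)  # high training accuracy on this separable toy problem
```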

Lecture notes | The block course is based on (German language) lecture notes. | |||||

Prerequisites / Notice | The exercises are done exclusively with the (free, open-source) software "R" (http://www.r-project.org). The final exam will also take place at the computers, using R (and your brains!). | |||||

447-6257-00L | Repeated Measures | W | 1 credit | 1G | L. Meier | |

Abstract | Generation and structure of repeated measures. Planning and realization of corresponding studies. Within- and between-subjects factors. Common covariance structures. Statistical analyses: graphical methods, summary statistics approach, univariate and multivariate ANOVA, linear mixed effects models. | |||||

Objective | Participants will gain the ability to recognize repeated measures and to analyze them adequately. They will know how to deal with pseudoreplicates. | |||||

447-6191-00L | Statistical Analysis of Financial Data | W | 2 credits | 1G | M. Dettling, A. F. Ruckstuhl | |

Abstract | Distributions for financial data. Volatility models: ARCH and GARCH models. Value at risk and expected shortfall. Portfolio theory: minimum-variance portfolio, efficient frontier, Sharpe ratio. Factor models: capital asset pricing model, macroeconomic factor models, fundamental factor models. Copulas: basic theory, Gaussian and t-copulas, Archimedean copulas, calibration of copulas. | |||||

Objective | Getting to know the typical properties of financial data and appropriate statistical models, incl. the corresponding functions in R. | |||||

447-6289-00L | Sampling Surveys | W | 2 credits | 1G | B. Hulliger | |

Abstract | The elements of a sample survey are explained. The most important classical sample designs (simple random sampling and stratified random sampling) with their estimation procedures and the use of auxiliary information including the Horvitz-Thompson estimator are introduced. Data preparation, non-response and its treatment, variance estimation and analysis of survey data is discussed. | |||||
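The Horvitz-Thompson estimator mentioned above is simple enough to state in code; a minimal sketch (our own illustration, not course material):

```python
def horvitz_thompson(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total: each sampled
    value is weighted by the inverse of its inclusion probability."""
    return sum(y / pi for y, pi in zip(sample_values, inclusion_probs))

# Simple random sampling without replacement: every unit has inclusion
# probability n/N, so the estimator reduces to N * (sample mean).
N, n = 1000, 100
sample = [5.0] * n                            # suppose every sampled value is 5
print(horvitz_thompson(sample, [n / N] * n))  # 5000.0, i.e. N * 5
```

Unequal inclusion probabilities (e.g. from stratified or probability-proportional-to-size designs) plug into the same formula, which is what makes the estimator central to design-based inference.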

Objective | Knowledge of the elements and the process of a sample survey. Understanding of the paradigm of random samples. Knowledge of simple random sampling and stratified random sampling, and the capability to apply the corresponding methods. Knowledge of further methods of sampling and estimation as well as of data preparation and analysis. | |||||

Lecture notes | Introduction to the statistical methods of survey research | |||||

401-3628-14L | Bayesian Statistics. Does not take place this semester. | W | 4 credits | 2V | ||

Abstract | Introduction to the Bayesian approach to statistics: decision theory, prior distributions, hierarchical Bayes models, Bayesian tests and model selection, empirical Bayes, Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods. | |||||

Objective | Students understand the conceptual ideas behind Bayesian statistics and are familiar with common techniques used in Bayesian data analysis. | |||||

Content | Topics that we will discuss are: the difference between the frequentist and Bayesian approach (decision theory, principles), priors (conjugate priors, noninformative priors, Jeffreys prior), tests and model selection (Bayes factors, hyper-g priors for regression), hierarchical models and empirical Bayes methods, and computational methods (Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods) | |||||
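The conjugate-prior idea above has a classic closed-form instance, the Beta-Binomial model; a minimal sketch (our own illustration, not course material):

```python
def beta_binomial_update(a, b, successes, failures):
    """Conjugate update: a Beta(a, b) prior on a success probability,
    combined with binomial data, gives a Beta(a+s, b+f) posterior."""
    return a + successes, b + failures

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(1, 1, 7, 3)
post_mean = a_post / (a_post + b_post)
print(a_post, b_post, post_mean)  # Beta(8, 4) posterior, mean 2/3
```

Conjugacy makes the posterior available in closed form; for non-conjugate models one falls back on the Laplace approximation or the Monte Carlo methods listed above.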

Lecture notes | A script will be available in English. | |||||

Literature | Christian Robert, The Bayesian Choice, 2nd edition, Springer 2007. A. Gelman et al., Bayesian Data Analysis, 3rd edition, Chapman & Hall (2013). Additional references will be given in the course. | |||||

Prerequisites / Notice | Familiarity with basic concepts of frequentist statistics and with basic concepts of probability theory (random variables, joint and conditional distributions, laws of large numbers and central limit theorem) will be assumed. | |||||

447-6273-00L | Bayes Methods | W | 2 credits | 2G | Y.‑L. Grize | |

Abstract | Conditional probability; Bayesian inference (conjugate distributions, HPD regions; linear and empirical Bayes); determination of the posterior distribution through simulation (MCMC with R2WinBUGS); introduction to multilevel/hierarchical models. | |||||

Objective | ||||||

Content | Bayesian statistics is attractive because it allows one to make decisions under uncertainty where a classical frequentist statistical approach fails. The course provides an introduction to Bayesian methods. It is moderately technical mathematically, but demands a flexibility of mind which should not be underestimated. | |||||

Literature | Gelman A., Carlin J.B., Stern H.S. and D.B. Rubin, Bayesian Data Analysis, Chapman and Hall, 2nd edition, 2004. Kruschke, J.K., Doing Bayesian Data Analysis, Elsevier, 2011. | |||||

Prerequisites / Notice | Prerequisites: basic knowledge of statistics; knowledge of R. | |||||

401-3913-01L | Mathematical Foundations for Finance | W | 4 credits | 3V + 2U | M. Schweizer | |

Abstract | First introduction to main modelling ideas and mathematical tools from mathematical finance | |||||

Objective | This course gives a first introduction to the main modelling ideas and mathematical tools from mathematical finance. It aims mainly at non-mathematicians who need an introduction to the main tools from stochastics used in mathematical finance. However, mathematicians who want to learn some basic modelling ideas and concepts for quantitative finance (before continuing with a more advanced course) may also find this of interest. The main emphasis will be on ideas, but important results will be given with (sometimes partial) proofs. | |||||

Content | Topics to be covered include - financial market models in finite discrete time - absence of arbitrage and martingale measures - valuation and hedging in complete markets - basics about Brownian motion - stochastic integration - stochastic calculus: Itô's formula, Girsanov transformation, Itô's representation theorem - Black-Scholes formula | |||||
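The Black-Scholes formula at the end of the list has a compact closed form; a minimal sketch (our own illustration, not course material) using only the standard library:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call:
    S*N(d1) - K*exp(-rT)*N(d2)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, one year to maturity.
print(round(black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0), 2))  # ~10.45
```

The course derives this formula from the hedging argument via Itô calculus and the Girsanov transformation rather than taking it as given.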

Lecture notes | Lecture notes will be made available at the beginning of the course. | |||||

Literature | Lecture notes will be made available at the beginning of the course. Additional (background) references are given there. | |||||

Prerequisites / Notice | Prerequisites: Results and facts from probability theory as in the book "Probability Essentials" by J. Jacod and P. Protter will be used freely. Especially participants without a direct mathematics background are strongly advised to familiarise themselves with those tools before (or very quickly during) the course. (A possible alternative to the above English textbook are the (German) lecture notes for the standard course "Wahrscheinlichkeitstheorie".) For those who are not sure about their background, we suggest looking at the exercises in Chapters 8, 9, 22-25 and 28 of the Jacod/Protter book. If these pose problems, you will have a hard time during the course. So be prepared. | |||||

401-3901-00L | Mathematical Optimization | W | 11 credits | 4V + 2U | R. Zenklusen | |

Abstract | Mathematical treatment of diverse optimization techniques. | |||||

Objective | The goal of this course is to get a thorough understanding of various classical mathematical optimization techniques with an emphasis on polyhedral approaches. In particular, we want students to develop a good understanding of some important problem classes in the field, of structural mathematical results linked to these problems, and of solution approaches based on this structural understanding. | |||||

Content | Key topics include: - Linear programming and polyhedra; - Flows and cuts; - Combinatorial optimization problems and techniques; - Equivalence between optimization and separation; - Brief introduction to Integer Programming. | |||||

Literature | - Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. - Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003. This work has 3 volumes. - Ravindra K. Ahuja, Thomas L. Magnanti, James B. Orlin. Network Flows: Theory, Algorithms, and Applications. Prentice Hall, 1993. - Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | |||||

Prerequisites / Notice | Solid background in linear algebra. | |||||

252-0535-00L | Advanced Machine Learning | W | 10 credits | 3V + 2U + 4A | J. M. Buhmann, C. Cotrini Jimenez | |

Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects. | |||||

Objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data. | |||||

Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: Fundamentals (what is data?, Bayesian learning, computational learning theory); supervised learning (ensembles: bagging and boosting; max-margin methods; neural networks); unsupervised learning (dimensionality reduction techniques, clustering, mixture models, non-parametric density estimation, learning dynamical systems) | |||||

Lecture notes | No lecture notes, but slides will be made available on the course webpage. | |||||

Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004. | |||||

Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE as well as practical programming experience for solving assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher based on project and exam) to gain credit points. | |||||

252-3005-00L | Natural Language Processing. Number of participants limited to 200. | W | 5 credits | 2V + 1U + 1A | R. Cotterell | |

Abstract | This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems. | |||||

Objective | The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques. | |||||

Content | This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems. | |||||

Literature | Jacob Eisenstein: Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series) | |||||

227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei | |

Abstract | The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: Universal approximation theorems, basics of approximation theory, fundamental limits of deep neural network learning, geometry of decision surfaces, capacity of separating surfaces, dimension measures relevant for generalization, VC dimension of neural networks. | |||||

Objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of (deep) neural networks. | |||||

Content | 1. Universal approximation with single- and multi-layer networks 2. Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory 3. Fundamental limits of deep neural network learning 4. Geometry of decision surfaces 5. Separating capacity of nonlinear decision surfaces 6. Dimension measures: Pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis (VC) dimension 7. Dimensions of neural networks 8. Generalization error in neural network learning | |||||
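Topic 1 above asks which functions networks can represent; a tiny exact instance (our own illustration, not course material): a one-hidden-layer ReLU network with two hidden units represents |x| exactly.

```python
def relu(z):
    return max(0.0, z)

def abs_net(x):
    """One-hidden-layer ReLU network with two hidden units that
    represents |x| exactly: |x| = relu(x) + relu(-x)."""
    hidden = [relu(1.0 * x), relu(-1.0 * x)]   # input weights +1 and -1, no bias
    return 1.0 * hidden[0] + 1.0 * hidden[1]   # both output weights equal 1

print([abs_net(x) for x in (-2.0, -0.5, 0.0, 3.0)])  # [2.0, 0.5, 0.0, 3.0]
```

Universal approximation theorems generalize this: general continuous functions can only be approximated, not represented exactly, and the required network size is what approximation theory quantifies.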

Lecture notes | Detailed lecture notes will be provided. | |||||

Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular. |
