# Search result: Catalogue data in Autumn Semester 2020

Statistics Master: The following courses belong to the curriculum of the Master's Programme in Statistics. The corresponding credits do not count as external credits, even for course units where an enrolment at ETH Zurich is not possible.

Master Studies (Programme Regulations 2020)

Subject Specific Electives

Number | Title | Type | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|---|

401-3601-00L | Probability Theory. At most one of the three course units (Bachelor Core Courses) 401-3461-00L Functional Analysis I, 401-3531-00L Differential Geometry I, 401-3601-00L Probability Theory can be recognised for the Master's degree in Mathematics or Applied Mathematics. In this case, you cannot change the category assignment yourself in myStudies but must contact the Study Administration Office (www.math.ethz.ch/studiensekretariat) after having received the credits. | W | 10 credits | 4V + 1U | A.‑S. Sznitman | |

Abstract | Basics of probability theory and the theory of stochastic processes in discrete time | |||||

Objective | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: basics of measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, the Galton-Watson chain, transition probabilities, the Ionescu-Tulcea theorem, Markov chains. | |||||

Content | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: basics of measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, the Galton-Watson chain, transition probabilities, the Ionescu-Tulcea theorem, Markov chains. | |||||
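The law of large numbers listed above is easy to see in simulation; the following is a minimal, purely illustrative Python sketch (the function name is hypothetical, not course material):

```python
import random

def empirical_mean_of_coin_flips(n, seed=0):
    """Mean of n fair 0/1 coin flips; by the law of large numbers this
    converges to the expectation 0.5 as n grows."""
    rng = random.Random(seed)
    return sum(rng.randint(0, 1) for _ in range(n)) / n
```

For large n the empirical mean settles close to 0.5 (typically within about 0.01 for n = 100000), while small samples can deviate substantially.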

Lecture notes | Available in electronic form. | |||||

Literature | R. Durrett, Probability: Theory and Examples, Duxbury Press, 1996. H. Bauer, Probability Theory, de Gruyter, 1996. J. Jacod and P. Protter, Probability Essentials, Springer, 2004. A. Klenke, Wahrscheinlichkeitstheorie, Springer, 2006. D. Williams, Probability with Martingales, Cambridge University Press, 1991. | |||||

401-3627-00L | High-Dimensional Statistics (does not take place this semester) | W | 4 credits | 2V | P. L. Bühlmann | |

Abstract | "High-Dimensional Statistics" deals with modern methods and theory for statistical inference when the number of unknown parameters is of much larger order than sample size. Statistical estimation and algorithms for complex models and aspects of multiple testing will be discussed. | |||||

Objective | Knowledge of methods and basic theory for high-dimensional statistical inference | |||||

Content | Lasso and Group Lasso for high-dimensional linear and generalized linear models; Additive models and many smooth univariate functions; Non-convex loss functions and l1-regularization; Stability selection, multiple testing and construction of p-values; Undirected graphical modeling | |||||
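The l1-regularization behind the Lasso and Group Lasso is driven by the soft-thresholding operator S(z, λ) = sign(z) · max(|z| − λ, 0), the proximal map of the l1 penalty used in coordinate descent; a minimal illustrative Python sketch (the function name is hypothetical, not from the course):

```python
def soft_threshold(z, lam):
    """Soft-thresholding: sign(z) * max(|z| - lam, 0).
    Shrinks z towards zero by lam and sets small values exactly to zero,
    which is what produces sparse Lasso solutions."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```

For example, soft-thresholding 3.0 at level 1.0 gives 2.0, while any |z| below the threshold is mapped exactly to zero.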

Literature | Peter Bühlmann and Sara van de Geer (2011). Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Verlag. ISBN 978-3-642-20191-2. | |||||

Prerequisites / Notice | Knowledge of basic concepts in probability theory, and intermediate knowledge of statistics (e.g. a course in linear models or computational statistics). | |||||

401-3612-00L | Stochastic Simulation | W | 5 credits | 3G | F. Sigrist | |

Abstract | This course introduces statistical Monte Carlo methods. This includes applications of stochastic simulation in various fields (statistics, statistical mechanics, operations research, financial mathematics), generating uniform and arbitrary random variables (incl. rejection and importance sampling), the accuracy of methods, variance reduction, quasi-Monte Carlo, and Markov chain Monte Carlo. | |||||

Objective | Students know the stochastic simulation methods introduced in this course. Students understand and can explain these methods, show how they are related to each other, know their weaknesses and strengths, apply them in practice, and prove key results. | |||||

Content | Examples of simulations in different fields (statistics, statistical mechanics, operations research, financial mathematics). Generation of uniform random variables. Generation of random variables with arbitrary distributions (including rejection sampling and importance sampling), simulation of multivariate normal variables and stochastic differential equations. The accuracy of Monte Carlo methods. Methods for variance reduction and quasi-Monte Carlo. Introduction to Markov chains and Markov chain Monte Carlo (Metropolis-Hastings, Gibbs sampler, Hamiltonian Monte Carlo, reversible jump MCMC). Algorithms introduced in the course are illustrated with the statistical software R. | |||||
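As one concrete instance of the generation methods listed above, rejection sampling can be sketched in a few lines. The example below is illustrative Python rather than the course's R code, with hypothetical names; it draws from a Beta(2, 2) density using a uniform proposal:

```python
import random

def rejection_sample(target_pdf, proposal_draw, proposal_pdf, M, rng):
    """Rejection sampling: draw x from the proposal and accept it with
    probability target_pdf(x) / (M * proposal_pdf(x)).
    Requires the envelope condition M * proposal_pdf >= target_pdf."""
    while True:
        x = proposal_draw()
        if rng.random() * M * proposal_pdf(x) <= target_pdf(x):
            return x

# Example: Beta(2, 2) has density 6x(1-x) on [0, 1] with maximum 1.5,
# so a Uniform(0, 1) proposal with envelope constant M = 1.5 suffices.
rng = random.Random(1)
samples = [
    rejection_sample(lambda x: 6 * x * (1 - x), rng.random, lambda x: 1.0, 1.5, rng)
    for _ in range(20000)
]
```

The empirical mean of the draws is close to the Beta(2, 2) mean of 1/2, and the expected acceptance rate is 1/M ≈ 2/3; a tighter envelope raises the acceptance rate.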

Lecture notes | A script will be available in English. | |||||

Literature | P. Glasserman, Monte Carlo Methods in Financial Engineering. Springer 2004. B. D. Ripley. Stochastic Simulation. Wiley, 1987. Ch. Robert, G. Casella. Monte Carlo Statistical Methods. Springer 2004 (2nd edition). | |||||

Prerequisites / Notice | It is assumed that students have had an introduction to probability theory and statistics (random variables, joint and conditional distributions, law of large numbers, central limit theorem, basics of measure theory). The course resources (including script, slides, exercises) will be provided via the Moodle online learning platform. | |||||

401-4619-67L | Advanced Topics in Computational Statistics (does not take place this semester) | W | 4 credits | 2V | not available | |

Abstract | This lecture covers selected advanced topics in computational statistics. This year the focus will be on graphical modelling. | |||||

Objective | Students learn the theoretical foundations of the selected methods, as well as practical skills to apply these methods and to interpret their outcomes. | |||||

Content | The main focus will be on graphical models in various forms: Markov properties of undirected graphs; Belief propagation; Hidden Markov Models; Structure estimation and parameter estimation; inference for high-dimensional data; causal graphical models | |||||

Prerequisites / Notice | We assume a solid background in mathematics, an introductory lecture in probability and statistics, and at least one more advanced course in statistics. | |||||

401-4633-00L | Data Analytics in Organisations and Business | W | 5 credits | 2V + 1U | I. Flückiger | |

Abstract | On the end-to-end process of data analytics in organisations & business, and how to transform data into insights for fact-based decisions. Presentation of the process from the beginning, with framing the business problem, to presenting the results and making decisions by the use of data analytics. For each topic, case studies from the financial service, healthcare and retail sectors will be presented. | |||||

Objective | The goal of this course is to give the students an understanding of the data analytics process in the business world, with special focus on the skills and techniques used besides the technical skills. The students will become familiar with the "business language", current problems and thinking in organisations and business, and the tools used. | |||||

Content | Framing the Business Problem; Framing the Analytics Problem; Data; Methodology; Model Building; Deployment; Model Lifecycle; Soft Skills for the Statistical/Mathematical Professional | |||||

Lecture notes | Lecture Notes will be available. | |||||

Prerequisites / Notice | Prerequisites: basic statistics, probability theory and regression | |||||

401-6217-00L | Using R for Data Analysis and Graphics (Part II) | W | 1.5 credits | 1G | M. Mächler | |

Abstract | The course provides the second part of an introduction to the statistical software R for scientists. Topics are data generation and selection, graphical functions, important statistical functions, types of objects, models, programming and writing functions. Note: this part builds on "Using R... (Part I)", but can be taken independently if the basics of R are already known. | |||||

Objective | The students will be able to use the software R efficiently for data analysis, graphics and simple programming | |||||

Content | The course provides the second part of an introduction to the statistical software R (https://www.r-project.org/) for scientists. R is free software that contains a huge collection of functions with a focus on statistics and graphics. To use R, one has to learn the programming language R, at least at a rudimentary level; the course aims to facilitate this by providing a basic introduction to R. Part II of the course builds on Part I and covers the following additional topics: - Elements of the R language: control structures (if, else, loops), lists, overview of R objects, attributes of R objects; - More on R functions; - Applying functions to elements of vectors, matrices and lists; - Object-oriented programming with R: classes and methods; - Tailoring R: options; - Extending basic R: packages. The course focuses on practical work at the computer. We will make use of the graphical user interface RStudio: www.rstudio.org | |||||

Lecture notes | An Introduction to R. http://stat.ethz.ch/CRAN/doc/contrib/Lam-IntroductionToR_LHL.pdf | |||||

Prerequisites / Notice | Basic knowledge of R equivalent to "Using R... (Part I)" (= 401-6215-00L) is a prerequisite for this course. The course resources will be provided via the Moodle web learning platform. Subscribing via myStudies should *automatically* make you a student participant of the Moodle course of this lecture, which is at https://moodle-app2.let.ethz.ch/course/view.php?id=13500. All material is available on this Moodle page. | |||||

401-0627-00L | Smoothing and Nonparametric Regression with Examples | W | 4 credits | 2G | S. Beran-Ghosh | |

Abstract | Starting with an overview of selected results from parametric inference, kernel smoothing will be introduced along with some asymptotic theory, optimal bandwidth selection, data driven algorithms and some special topics. Examples from environmental research will be used for motivation, but the methods will also be applicable elsewhere. | |||||

Objective | The students will learn about methods of kernel smoothing and application of concepts to data. The aim will be to build sufficient interest in the topic and intuition as well as the ability to implement the methods to various different datasets. | |||||

Content | Rough outline: - Parametric estimation methods: selection of important results (maximum likelihood; method of least squares: regression & diagnostics) - Nonparametric curve estimation (density estimation, kernel regression, local polynomials, bandwidth selection; selection of special topics as time permits, such as rapid change points, mode estimation, robust smoothing, partial linear models, etc.) - Applications: potential areas of application will be discussed, such as change assessment, trend and surface estimation, probability and quantile curve estimation, and others. | |||||
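Kernel regression, as listed in the outline, can be illustrated with the Nadaraya-Watson estimator: a locally weighted average of the responses with kernel weights. A hypothetical Python sketch (illustrative only, not course material):

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    """Nadaraya-Watson estimate at x0: a weighted average of the ys with
    Gaussian kernel weights K((x0 - xi) / h), where h is the bandwidth.
    Larger h means smoother (more biased, less variable) estimates."""
    weights = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)
```

The estimator reproduces constant data exactly, and by symmetry the estimate midway between two equally weighted points is their average; bandwidth selection, covered in the course, decides how far the averaging reaches.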

Lecture notes | Brief summaries or outlines of some of the lecture material will be posted at https://www.wsl.ch/en/employees/ghosh.html. NOTE: The posted notes will tend to be just sketches whereas only the in-class lessons will contain complete information. LOG IN: In order to have access to the posted notes, you will need the course user id & the password. These will be given out on the first day of the lectures. | |||||

Literature | References: - Statistical Inference, by S.D. Silvey, Chapman & Hall. - Regression Analysis: Theory, Methods and Applications, by A. Sen and M. Srivastava, Springer. - Density Estimation, by B.W. Silverman, Chapman and Hall. - Nonparametric Simple Regression, by J. Fox, Sage Publications. - Applied Smoothing Techniques for Data Analysis: the Kernel Approach With S-Plus Illustrations, by A.W. Bowman, A. Azzalini, Oxford University Press. - Kernel Smoothing: Principles, Methods and Applications, by S. Ghosh, Wiley. Additional references will be given out in the lectures. | |||||

Prerequisites / Notice | Prerequisites: A background in Linear Algebra, Calculus, Probability & Statistical Inference including Estimation and Testing. | |||||

447-6289-00L | Sampling Surveys. Special Students "University of Zurich (UZH)" in the Master Program in Biostatistics at UZH cannot register for this course unit electronically. Forward the lecturer's written permission to attend to the Registrar's Office. Alternatively, the lecturer may also send an email directly to registrar@ethz.ch. The Registrar's Office will then register you for the course. | W | 2 credits | 1G | B. Hulliger | |

Abstract | The elements of a sample survey are explained. The most important classical sample designs (simple random sampling and stratified random sampling) with their estimation procedures and the use of auxiliary information including the Horvitz-Thompson estimator are introduced. Data preparation, non-response and its treatment, variance estimation and analysis of survey data is discussed. | |||||

Objective | Knowledge of the elements and the process of a sample survey. Understanding of the paradigm of random samples. Knowledge of simple random sampling and stratified random sampling and the capability to apply the corresponding methods. Knowledge of further methods of sampling and estimation as well as data preparation and analysis. | |||||

Lecture notes | Introduction to the statistical methods of survey research | |||||

401-3628-14L | Bayesian Statistics (does not take place this semester) | W | 4 credits | 2V | ||

Abstract | Introduction to the Bayesian approach to statistics: decision theory, prior distributions, hierarchical Bayes models, empirical Bayes, Bayesian tests and model selection, Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods. | |||||

Objective | Students understand the conceptual ideas behind Bayesian statistics and are familiar with common techniques used in Bayesian data analysis. | |||||

Content | Topics that we will discuss are: difference between the frequentist and Bayesian approach (decision theory, principles), priors (conjugate priors, noninformative priors, Jeffreys prior), tests and model selection (Bayes factors, hyper-g priors for regression), hierarchical models and empirical Bayes methods, computational methods (Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods) | |||||
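Conjugate priors, mentioned above, admit closed-form posterior updates. As a minimal illustration (hypothetical Python, not from the course): the Beta prior is conjugate to Binomial data, so the posterior is again a Beta distribution with updated counts.

```python
def beta_binomial_update(a, b, successes, failures):
    """Conjugate update: a Beta(a, b) prior combined with Binomial data
    yields a Beta(a + successes, b + failures) posterior."""
    return a + successes, b + failures

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)
```

Starting from a uniform Beta(1, 1) prior and observing 7 successes and 3 failures gives a Beta(8, 4) posterior with mean 8/12 ≈ 0.667, a compromise between the prior mean 0.5 and the sample proportion 0.7.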

Lecture notes | A script will be available in English. | |||||

Literature | Christian Robert, The Bayesian Choice, 2nd edition, Springer 2007. A. Gelman et al., Bayesian Data Analysis, 3rd edition, Chapman & Hall (2013). Additional references will be given in the course. | |||||

Prerequisites / Notice | Familiarity with basic concepts of frequentist statistics and with basic concepts of probability theory (random variables, joint and conditional distributions, laws of large numbers and central limit theorem) will be assumed. | |||||

447-6273-00L | Bayes Methods. Special Students "University of Zurich (UZH)" in the Master Program in Biostatistics at UZH cannot register for this course unit electronically. Forward the lecturer's written permission to attend to the Registrar's Office. Alternatively, the lecturer may also send an email directly to registrar@ethz.ch. The Registrar's Office will then register you for the course. | W | 2 credits | 2G | Y.‑L. Grize | |

Abstract | Conditional probability; Bayes inference (conjugate distributions, HPD regions; linear and empirical Bayes); determination of the posterior distribution through simulation (MCMC with R2WinBUGS); introduction to multilevel/hierarchical models. | |||||

Objective | ||||||

Content | Bayesian statistics is attractive because it allows decisions to be made under uncertainty where a classical frequentist approach fails. The course provides an introduction to Bayesian methods. It is moderately technical mathematically, but demands a flexibility of mind that should not be underestimated. | |||||

Literature | Gelman A., Carlin J.B., Stern H.S. and D.B. Rubin, Bayesian Data Analysis, Chapman and Hall, 2nd edition, 2004. Kruschke, J.K., Doing Bayesian Data Analysis, Elsevier, 2011. | |||||

Prerequisites / Notice | Prerequisites: basic knowledge of statistics; knowledge of R. | |||||

401-3901-00L | Mathematical Optimization | W | 11 credits | 4V + 2U | R. Zenklusen | |

Abstract | Mathematical treatment of diverse optimization techniques. | |||||

Objective | The goal of this course is to get a thorough understanding of various classical mathematical optimization techniques with an emphasis on polyhedral approaches. In particular, we want students to develop a good understanding of some important problem classes in the field, of structural mathematical results linked to these problems, and of solution approaches based on this structural understanding. | |||||

Content | Key topics include: - Linear programming and polyhedra; - Flows and cuts; - Combinatorial optimization problems and techniques; - Equivalence between optimization and separation; - Brief introduction to Integer Programming. | |||||

Literature | - Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. - Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003. This work has 3 volumes. - Ravindra K. Ahuja, Thomas L. Magnanti, James B. Orlin. Network Flows: Theory, Algorithms, and Applications. Prentice Hall, 1993. - Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | |||||

Prerequisites / Notice | Solid background in linear algebra. | |||||

252-0535-00L | Advanced Machine Learning | W | 10 credits | 3V + 2U + 4A | J. M. Buhmann, C. Cotrini Jimenez | |

Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects. | |||||

Objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistics knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data. | |||||

Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: Fundamentals (what is data?, Bayesian learning, computational learning theory); Supervised learning (ensembles: bagging and boosting, max-margin methods, neural networks); Unsupervised learning (dimensionality reduction techniques, clustering, mixture models, non-parametric density estimation); Learning dynamical systems. | |||||

Lecture notes | No lecture notes, but slides will be made available on the course webpage. | |||||

Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004. | |||||

Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE as well as practical programming experience for solving assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher based on project and exam) to gain credit points. | |||||

252-3005-00L | Natural Language Processing. Number of participants limited to 200. | W | 5 credits | 2V + 1U + 1A | R. Cotterell | |

Abstract | This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems. | |||||

Objective | The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques. | |||||

Content | This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems. | |||||

Literature | Jacob Eisenstein: Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series) | |||||

227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei | |

Abstract | The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: Universal approximation theorems, basics of approximation theory, fundamental limits of deep neural network learning, geometry of decision surfaces, capacity of separating surfaces, dimension measures relevant for generalization, VC dimension of neural networks. | |||||

Objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of (deep) neural networks. | |||||

Content | 1. Universal approximation with single- and multi-layer networks 2. Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory 3. Fundamental limits of deep neural network learning 4. Geometry of decision surfaces 5. Separating capacity of nonlinear decision surfaces 6. Dimension measures: Pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis (VC) dimension 7. Dimensions of neural networks 8. Generalization error in neural network learning | |||||

Lecture notes | Detailed lecture notes will be provided. | |||||

Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular. | |||||

401-4521-70L | Geometric Tomography - Uniqueness, Statistical Reconstruction and Algorithms | W | 4 credits | 2V | J. Hörrmann | |

Abstract | Self-contained course on the theoretical aspects of the reconstruction of geometric objects from tomographic projection and section data. | |||||

Objective | Introduction to geometric tomography and understanding of various theoretical aspects of reconstruction problems. | |||||

Content | The problem of reconstruction of an object from geometric information like X-ray data is a classical inverse problem on the overlap between applied mathematics, statistics, computer science and electrical engineering. We focus on various aspects of the problem in the case of prior shape information on the reconstruction object. We will answer questions on uniqueness of the reconstruction and also cover statistical and algorithmic aspects. | |||||

Literature | R. Gardner: Geometric Tomography F. Natterer: The Mathematics of Computerized Tomography A. Rieder: Keine Probleme mit inversen Problemen | |||||

Prerequisites / Notice | A sound mathematical background in geometry, analysis and probability is required though a repetition of relevant material will be included. The ability to understand and write mathematical proofs is mandatory. | |||||

401-6282-00L | Statistical Analysis of High-Throughput Genomic and Transcriptomic Data (University of Zurich). No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH. UZH module code: STA426. Mind the enrolment deadlines at UZH: https://www.uzh.ch/cmsssl/en/studies/application/mobilitaet.html | W | 5 credits | 3G | H. Rehrauer, M. Robinson | |

Abstract | A range of topics will be covered, including basic molecular biology, genomics technologies and in particular, a wide range of statistical and computational methods that have been used in the analysis of DNA microarray and high throughput sequencing experiments. | |||||

Objective | - Understand the fundamental "scientific process" in the field of statistical bioinformatics - Be equipped with the skills/tools to preprocess genomic data (Unix, Bioconductor, mapping, etc.) and ensure reproducible research (Sweave) - Have a general knowledge of the types of data and biological applications encountered with microarray and sequencing data - Have a general knowledge of the range of statistical methods that are used with microarray and sequencing data - Gain the ability to apply statistical methods/knowledge/software to a collaborative biological project - Gain the ability to critically assess the statistical bioinformatics literature - Write a coherent summary of a bioinformatics problem and its solution in statistical terms | |||||

Content | Lectures will include: microarray preprocessing; normalization; exploratory data analysis techniques such as clustering, PCA and multidimensional scaling; Controlling error rates of statistical tests (FPR versus FDR versus FWER); limma (linear models for microarray analysis); mapping algorithms (for RNA/ChIP-seq); RNA-seq quantification; statistical analyses for differential count data; isoform switching; epigenomics data including DNA methylation; gene set analyses; classification | |||||
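FDR control, listed above alongside FPR and FWER, is typically carried out with the Benjamini-Hochberg step-up procedure: sort the p-values, find the largest rank k with p_(k) ≤ (k/m)·α, and reject the k smallest. A minimal illustrative Python sketch (the function name is hypothetical, not from the course):

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up procedure. Returns the (sorted) indices
    of rejected hypotheses, controlling the false discovery rate at alpha
    for independent tests."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k = rank  # step-up: keep the largest rank passing the threshold
    return sorted(order[:k])
```

Note the step-up character: a p-value may exceed its own threshold yet still be rejected if a larger-ranked p-value passes.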

Lecture notes | Lecture notes, published manuscripts | |||||

Prerequisites / Notice | Prerequisites: basic knowledge of the programming language R, sufficient knowledge in statistics. Former course title: Statistical Methods for the Analysis of Microarray and Short-Read Sequencing Data | |||||

401-8625-00L | Clinical Biostatistics (University of Zurich). No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH. UZH module code: STA404. Mind the enrolment deadlines at UZH: https://www.uzh.ch/cmsssl/en/studies/application/mobilitaet.html | W | 6 credits | 4G | University lecturers | |

Abstract | Discussion of the different statistical methods that are used in clinical research. | |||||

Objective | ||||||

Content | Discussion of the different statistical methods that are used in clinical research. Among other subjects the following will be introduced: sample size calculation, randomization and blinding, analysis of clinical trials (parallel groups design, analysis of covariance, crossover design, equivalence studies), intention-to-treat analysis, multiple testing, group sequential methods, adaptive designs, diagnostic studies, and agreement studies. | |||||
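Sample size calculation, the first topic above, is often based on a normal approximation for comparing two group means: n = 2(z_{1−α/2} + z_{power})²(σ/δ)² per group. A hypothetical Python sketch (illustrative only, not course material), using the standard library's `statistics.NormalDist`:

```python
import math
from statistics import NormalDist

def sample_size_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Approximate per-group sample size for a two-sided two-sample z-test
    detecting a mean difference delta with common standard deviation sigma:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # about 1.96 for alpha = 0.05
    z_power = z.inv_cdf(power)          # about 0.84 for power = 0.8
    return math.ceil(2 * (z_alpha + z_power) ** 2 * (sigma / delta) ** 2)
```

For a standardized effect of 0.5 (delta = 0.5, sigma = 1) at 5% significance and 80% power this gives about 63 per group under the normal approximation; the t-test correction adds one or two observations.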

Literature | - Matthews, J. N. S. (2006). Introduction to Randomized Controlled Clinical Trials. Chapman & Hall/CRC Texts in Statistical Science. - Cook, T. D. and DeMets, L. D. (2008). Introduction to Statistical Methods for Clinical Trials. Chapman & Hall/CRC Texts in Statistical Science. - Pepe, M. (2003). The Statistical Evaluation of Medical Tests for Classification and Prediction. Oxford University Press. - Schumacher, M. and Schulgen, G. (2008). Methodik klinischer Studien. Springer, Berlin. | |||||

Prerequisites / Notice | Basic knowledge of the programming language R; sufficient knowledge in calculus, linear algebra, probability and statistics | |||||

263-3210-00L | Deep Learning | W | 8 credits | 3V + 2U + 2A | T. Hofmann | |

Abstract | Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations. | |||||

Objective | In recent years, deep learning and deep networks have significantly improved the state-of-the-art in many application domains such as computer vision, speech recognition, and natural language processing. This class will cover the mathematical foundations of deep learning and provide insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology. | |||||

Prerequisites / Notice | This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course will make heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or Tensorflow, although that may be a side benefit. Participation in the course is subject to the following condition: students must have taken the exam in Advanced Machine Learning (252-0535-00) or have acquired equivalent knowledge; see the exhaustive list below: Advanced Machine Learning https://ml2.inf.ethz.ch/courses/aml/ Computational Intelligence Lab http://da.inf.ethz.ch/teaching/2019/CIL/ Introduction to Machine Learning https://las.inf.ethz.ch/teaching/introml-S19 Statistical Learning Theory http://ml2.inf.ethz.ch/courses/slt/ Computational Statistics https://stat.ethz.ch/lectures/ss19/comp-stats.php Probabilistic Artificial Intelligence https://las.inf.ethz.ch/teaching/pai-f18 | |||||

263-5210-00L | Probabilistic Artificial Intelligence | W | 8 credits | 3V + 2U + 2A | A. Krause | |

Abstract | This course introduces core modeling techniques and algorithms from machine learning, optimization and control for reasoning and decision making under uncertainty, and studies applications in areas such as robotics and the Internet. | |||||

Objective | How can we build systems that perform well in uncertain environments and unforeseen situations? How can we develop systems that exhibit "intelligent" behavior, without prescribing explicit rules? How can we build systems that learn from experience in order to improve their performance? We will study core modeling techniques and algorithms from statistics, optimization, planning, and control and study applications in areas such as sensor networks, robotics, and the Internet. The course is designed for graduate students. | |||||

Content | Topics covered: - Probability - Probabilistic inference (variational inference, MCMC) - Bayesian learning (Gaussian processes, Bayesian deep learning) - Probabilistic planning (MDPs, POMDPs) - Multi-armed bandits and Bayesian optimization - Reinforcement learning | |||||
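Multi-armed bandits, listed above, are often introduced via the epsilon-greedy strategy, which balances exploring random arms against exploiting the empirically best one. A minimal illustrative Python sketch (all names hypothetical, not from the course):

```python
import random

def epsilon_greedy(pull, n_arms, steps, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: with probability epsilon pull a random arm
    (explore), otherwise pull the arm with the highest empirical mean
    reward (exploit). Returns pull counts and empirical means per arm."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for _ in range(steps):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(n_arms)  # explore (ensure every arm is tried)
        else:
            arm = max(range(n_arms), key=lambda a: means[a])  # exploit
        reward = pull(arm)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running mean
    return counts, means
```

With two arms paying 0.2 and 0.8, the strategy quickly concentrates its pulls on the better arm while still devoting roughly an epsilon fraction of steps to exploration; more refined strategies such as UCB or Thompson sampling adapt the exploration rate.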

Prerequisites / Notice | Solid basic knowledge in statistics, algorithms and programming. The material covered in the course "Introduction to Machine Learning" is considered as a prerequisite. |
