Search result: Catalogue data in Autumn Semester 2020
|Master's Degree Programme (Programme Regulations 2020)|
|Major in Machine Intelligence|
|252-0535-00L||Advanced Machine Learning||W||10 credits||3V + 2U + 4A||J. M. Buhmann, C. Cotrini Jimenez|
|Abstract||Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.|
|Objective||Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the algorithms on real-world data.|
|Content||The theory of fundamental machine learning concepts is presented in the lecture, and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data.|
Topics covered in the lecture include:
What is data?
Computational learning theory
Ensembles: Bagging and Boosting
Max Margin methods
Dimensionality reduction techniques
Non-parametric density estimation
Learning Dynamical Systems
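The ensemble methods listed above admit a compact illustration. The following is not part of the course materials, just a minimal, hypothetical bagging sketch: bootstrap-resample a toy 1-D dataset, fit one decision stump per resample, and predict by majority vote (all names and data are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y):
    """Fit a 1-D decision stump: pick the threshold and orientation
    minimising the 0/1 training error."""
    best = None
    for t in np.unique(X):
        for sign in (1, -1):
            pred = np.where(X >= t, sign, -sign)
            err = np.mean(pred != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda Z, t=t, sign=sign: np.where(Z >= t, sign, -sign)

def bagging(X, y, n_estimators=25):
    """Bagging: fit one stump per bootstrap resample of (X, y),
    then aggregate by majority vote."""
    stumps = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))
        stumps.append(fit_stump(X[idx], y[idx]))
    return lambda Z: np.sign(sum(s(Z) for s in stumps))

# Noisy toy data: true label is +1 iff x > 0, with 10% of labels flipped.
X = rng.normal(size=200)
y = np.where(X > 0, 1, -1)
y[rng.random(200) < 0.1] *= -1

model = bagging(X, y)
acc = np.mean(model(X) == np.where(X > 0, 1, -1))  # accuracy vs. clean labels
```

Averaging over bootstrap resamples reduces the variance of the individual stumps, which is the core point of bagging.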
|Lecture notes||No lecture notes, but slides will be made available on the course webpage.|
|Literature||C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007.|
R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001.
T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
|Prerequisites / Notice||The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving assignments.|
Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution.
PhD students are required to obtain a passing grade in the course (4.0 or higher based on project and exam) to gain credit points.
|263-3210-00L||Deep Learning||W||8 credits||3V + 2U + 2A||T. Hofmann|
|Abstract||Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations.|
|Objective||In recent years, deep learning and deep networks have significantly improved the state-of-the-art in many application domains such as computer vision, speech recognition, and natural language processing. This class will cover the mathematical foundations of deep learning and provide insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology.|
|Prerequisites / Notice||This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course will make heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or TensorFlow, although that may be a side benefit.|
Participation in the course is subject to the following condition:
- Students must have taken the exam in Advanced Machine Learning (252-0535-00L) or have acquired equivalent knowledge; see the exhaustive list below:
Advanced Machine Learning
Computational Intelligence Lab
Introduction to Machine Learning
Statistical Learning Theory
Probabilistic Artificial Intelligence
|263-5210-00L||Probabilistic Artificial Intelligence||W||8 credits||3V + 2U + 2A||A. Krause|
|Abstract||This course introduces core modeling techniques and algorithms from machine learning, optimization and control for reasoning and decision making under uncertainty, and studies applications in areas such as robotics and the Internet.|
|Objective||How can we build systems that perform well in uncertain environments and unforeseen situations? How can we develop systems that exhibit "intelligent" behavior, without prescribing explicit rules? How can we build systems that learn from experience in order to improve their performance? We will study core modeling techniques and algorithms from statistics, optimization, planning, and control and study applications in areas such as sensor networks, robotics, and the Internet. The course is designed for graduate students.|
- Probabilistic inference (variational inference, MCMC)
- Bayesian learning (Gaussian processes, Bayesian deep learning)
- Probabilistic planning (MDPs, POMDPs)
- Multi-armed bandits and Bayesian optimization
- Reinforcement learning
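As a flavour of the Bayesian learning topic above, here is a minimal Gaussian process regression sketch (illustrative only, not course material); the squared-exponential kernel, length scale and noise level are arbitrary choices:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at test points Xs,
    conditioned on noisy observations (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kss = rbf(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mean, var

# Observe a sine curve; query one point inside and one far outside the data.
X = np.linspace(-3, 3, 20)
y = np.sin(X)
Xs = np.array([0.0, 10.0])
mean, var = gp_posterior(X, y, Xs)
# Near the data the posterior tracks sin(x) with low variance;
# far away it reverts to the prior (mean 0, variance ~1).
```

The growing predictive variance away from the data is exactly what Bayesian optimization and bandit methods exploit to trade off exploration and exploitation.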
|Prerequisites / Notice||Solid basic knowledge in statistics, algorithms and programming.|
The material covered in the course "Introduction to Machine Learning" is considered a prerequisite.
|252-3005-00L||Natural Language Processing|
Number of participants limited to 200.
|W||5 credits||2V + 1U + 1A||R. Cotterell|
|Abstract||This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.|
|Objective||The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.|
|Content||This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.|
|Literature||Jacob Eisenstein: Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)|
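As a flavour of the statistical language processing the course above builds on, here is a minimal, hypothetical bigram language model sketch (the toy corpus and function names are invented for illustration):

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count bigrams over whitespace-tokenised sentences,
    padded with <s> and </s> boundary markers."""
    counts = defaultdict(Counter)
    for sent in corpus:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            counts[prev][cur] += 1
    return counts

def prob(counts, prev, cur):
    """Maximum-likelihood estimate of P(cur | prev)."""
    total = sum(counts[prev].values())
    return counts[prev][cur] / total if total else 0.0

corpus = ["the dog barks", "the cat sleeps", "the dog sleeps"]
counts = train_bigram(corpus)
# P(dog | the) = 2/3, P(the | <s>) = 1, P(barks | dog) = 1/2
```

Real systems add smoothing for unseen bigrams; the maximum-likelihood estimate here assigns them probability zero.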
|261-5100-00L||Computational Biomedicine|
Number of participants limited to 60.
|W||5 credits||2V + 1U + 1A||G. Rätsch, V. Boeva, N. Davidson|
|Abstract||The course critically reviews central problems in Biomedicine and discusses the technical foundations and solutions for these problems.|
|Objective||Over the past years, rapid technological advancements have transformed classical disciplines such as biology and medicine into fields of applied data science. While the sheer amount of collected data often makes computational approaches inevitable for analysis, it is the domain-specific structure and the close relation to research and the clinic that call for accurate, robust and efficient algorithms. In this course we will critically review central problems in Biomedicine and discuss the technical foundations and solutions for these problems.|
|Content||The course will consist of three topic clusters that will cover different aspects of data science problems in Biomedicine:|
1) String algorithms for the efficient representation, search, comparison, composition and compression of large sets of strings, mostly originating from DNA or RNA sequencing. This includes genome assembly, efficient index data structures for strings and graphs, alignment techniques, as well as quantitative approaches.
2) Statistical models and algorithms for the assessment and functional analysis of individual genomic variations. This includes the identification of variants, prediction of functional effects, imputation and integration problems, as well as the association with clinical phenotypes.
3) Models for the organization and representation of large-scale biomedical data. This includes ontology concepts, biomedical databases, sequence annotation and data compression.
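The alignment techniques in topic cluster 1 rest on dynamic programming. A minimal, illustrative sketch (not course material) is the Levenshtein edit distance, the simplest scoring scheme behind sequence alignment:

```python
def edit_distance(a, b):
    """Levenshtein distance between sequences a and b via dynamic
    programming: dp[i][j] is the cost of aligning a[:i] with b[:j]."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[m][n]
```

Genome-scale tools replace this quadratic table with index structures and banded or seeded variants, but the recurrence is the same.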
|Prerequisites / Notice||Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line|
|263-2400-00L||Reliable and Interpretable Artificial Intelligence||W||6 credits||2V + 2U + 1A||M. Vechev|
|Abstract||Creating reliable and explainable probabilistic models is a fundamental challenge to solving the artificial intelligence problem. This course covers some of the latest and most exciting advances that bring us closer to constructing such models.|
|Objective||The main objective of this course is to expose students to the latest and most exciting research in the area of explainable and interpretable artificial intelligence, a topic of fundamental and increasing importance. Upon completion of the course, the students should have mastered the underlying methods and be able to apply them to a variety of problems.|
To facilitate deeper understanding, an important part of the course will be a group hands-on programming project where students will build a system based on the learned material.
|Content||The course covers some of the latest research (over the last 2-3 years) underlying the creation of safe, trustworthy, and reliable AI (more information here: https://www.sri.inf.ethz.ch/teaching/riai2020):|
* Adversarial Attacks on Deep Learning (noise-based, geometry attacks, sound attacks, physical attacks, autonomous driving, out-of-distribution)
* Defenses against attacks
* Combining gradient-based optimization with logic for encoding background knowledge
* Complete certification of deep neural networks via automated reasoning (e.g., numerical abstractions, mixed-integer solvers)
* Probabilistic certification of deep neural networks
* Training deep neural networks to be provably robust via automated reasoning
* Understanding and interpreting deep networks
* Probabilistic programming
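To give a flavour of the noise-based adversarial attacks listed above, here is a minimal, hypothetical sketch of the Fast Gradient Sign Method on a toy logistic classifier (the weights and input are invented for illustration; the course works with deep networks, not this toy model):

```python
import numpy as np

def fgsm(x, w, b, y, eps):
    """Fast Gradient Sign Method on a logistic classifier:
    move x by eps in the sign of the loss gradient w.r.t. the input,
    i.e. the direction that increases the cross-entropy for label y."""
    z = w @ x + b
    p = 1.0 / (1.0 + np.exp(-z))  # predicted P(y = 1 | x)
    grad_x = (p - y) * w          # d(cross-entropy)/dx for a linear logit
    return x + eps * np.sign(grad_x)

w = np.array([2.0, -1.0])
b = 0.0
x = np.array([0.2, -0.1])        # logit 0.5 > 0, so classified as 1
x_adv = fgsm(x, w, b, y=1.0, eps=0.3)
score = w @ x_adv + b            # the tiny perturbation flips the sign
```

Certification methods, in contrast, try to prove that no perturbation within the eps-ball can flip the prediction.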
|Prerequisites / Notice||While not a formal requirement, the course assumes familiarity with basics of machine learning (especially probability theory, linear algebra, gradient descent, and neural networks). These topics are usually covered in "Intro to ML" classes at most institutions (e.g., "Introduction to Machine Learning" at ETH).|
For solving assignments, some programming experience in Python is expected.
|263-4500-00L||Advanced Algorithms||W||9 credits||3V + 2U + 3A||M. Ghaffari|
|Abstract||This is a graduate-level course on algorithm design (and analysis). It covers a range of topics and techniques in approximation algorithms, sketching and streaming algorithms, and online algorithms.|
|Objective||This course familiarizes the students with some of the main tools and techniques in modern subareas of algorithm design.|
|Content||The lectures will cover a range of topics, tentatively including the following: graph sparsifications while preserving cuts or distances, various approximation algorithms techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms, sketching algorithms, and derandomization.|
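Among the topics above, multiplicative weight updates admit a particularly compact sketch. The following is an illustrative implementation (not course material) of the update rule w_i ← w_i · exp(−η · loss_i) for prediction with expert advice; the learning rate and loss sequence are invented for illustration:

```python
import math

def multiplicative_weights(losses, eta=0.5):
    """Multiplicative Weights Update over a stream of loss vectors in [0, 1]:
    play the normalized weight distribution, then down-weight each expert i
    by exp(-eta * loss_i). Returns the algorithm's total expected loss and
    the final weights."""
    n = len(losses[0])
    w = [1.0] * n
    total_loss = 0.0
    for loss in losses:
        s = sum(w)
        p = [wi / s for wi in w]  # current distribution over experts
        total_loss += sum(pi * li for pi, li in zip(p, loss))
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, loss)]
    return total_loss, w

# Expert 0 always incurs loss 1, expert 1 never does: the weight (and the
# played distribution) shifts geometrically toward expert 1, so the total
# loss stays bounded even over many rounds.
losses = [[1.0, 0.0]] * 20
total, w = multiplicative_weights(losses)
```

The standard analysis bounds the regret against the best fixed expert by O(ln n / η + η T), which is the reason this update reappears in approximation algorithms and online learning alike.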
|Prerequisites / Notice||This course is designed for master's and doctoral students, and it especially targets those interested in theoretical computer science, but it should also be accessible to last-year bachelor students.|
Sufficient comfort with both (A) Algorithm Design & Analysis and (B) Probability & Concentrations is expected. E.g., having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you are ready for this class, please consult the instructor.
|263-5902-00L||Computer Vision||W||8 credits||3V + 1U + 3A||M. Pollefeys, S. Tang, V. Ferrari|
|Abstract||The goal of this course is to provide students with a good understanding of computer vision and image analysis techniques. The main concepts and techniques will be studied in depth and practical algorithms and approaches will be discussed and explored through the exercises.|
|Objective||The objectives of this course are:|
1. To introduce the fundamental problems of computer vision.
2. To introduce the main concepts and techniques used to solve those problems.
3. To enable participants to implement solutions for reasonably complex problems.
4. To enable participants to make sense of the computer vision literature.
|Content||Camera models and calibration, invariant features, multiple-view geometry, model fitting, stereo matching, segmentation, 2D shape matching, shape from silhouettes, optical flow, structure from motion, tracking, object recognition, object category recognition|
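As an illustration of the model-fitting topic above, here is a minimal, hypothetical RANSAC sketch for fitting a line to 2-D points with gross outliers (the thresholds, iteration count and toy data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def ransac_line(points, n_iters=200, thresh=0.1):
    """RANSAC line fitting: repeatedly draw a 2-point minimal sample,
    hypothesise the line through it, and keep the hypothesis with the
    most inliers (points within `thresh` perpendicular distance)."""
    best_inliers = None
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.hypot(d[0], d[1])
        if norm == 0:
            continue
        # Perpendicular distance of every point to the line through p and q
        # (magnitude of the 2-D cross product with the unit direction).
        dist = np.abs(d[0] * (points[:, 1] - p[1])
                      - d[1] * (points[:, 0] - p[0])) / norm
        inliers = dist < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# 80 points near the line y = 2x + 1 plus 20 gross outliers.
x = rng.uniform(-1, 1, 80)
line_pts = np.stack([x, 2 * x + 1], axis=1) + rng.normal(0, 0.02, (80, 2))
outliers = rng.uniform(-5, 5, (20, 2))
points = np.concatenate([line_pts, outliers])
inliers = ransac_line(points)  # boolean mask selecting mostly the line points
```

The same sample-and-score loop underlies robust estimation of homographies and fundamental matrices in multiple-view geometry, with larger minimal samples.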
|Prerequisites / Notice||It is recommended that students have taken the Visual Computing lecture or a similar course introducing basic image processing concepts before taking this course.|