Niao He: Catalogue data in Spring Semester 2022
Name | Prof. Dr. Niao He
Field of Study | Computer Science
Address | Professur für Informatik, ETH Zürich, OAT Y 21.1, Andreasstrasse 5, 8092 Zürich, SWITZERLAND
E-Mail | niao.he@inf.ethz.ch
URL | https://odi.inf.ethz.ch/
Department | Computer Science
Relationship | Assistant Professor (Tenure Track)
Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
252-0945-14L | Doctoral Seminar Machine Learning (FS22). Only for Computer Science Ph.D. students. This doctoral seminar is intended for PhD students affiliated with the Institute for Machine Learning. Other PhD students who work on machine learning projects or related topics need approval from at least one of the organizers to register for the seminar. | 2 ECTS | 1S | N. He, M. Sachan, A. Krause, G. Rätsch
Abstract | An essential aspect of any research project is the dissemination of its findings. Here we focus on oral communication, which includes: appropriate selection of material, preparation of visual aids (slides and/or posters), and presentation skills.
Learning objective | The seminar participants should learn how to prepare and deliver scientific talks and how to deal with technical questions. Participants are also expected to contribute actively to discussions during presentations by others, thus learning and practicing critical thinking skills.
Prerequisites / Notice | This doctoral seminar of the Machine Learning Laboratory of ETH is intended for PhD students of the ML lab who work on a machine learning project.
261-5110-00L | Optimization for Data Science | 10 ECTS | 3V + 2U + 4A | B. Gärtner, N. He
Abstract | This course provides an in-depth theoretical treatment of optimization methods that are relevant in data science.
Learning objective | Understanding the guarantees and limits of relevant optimization methods used in data science. Learning theoretical paradigms and techniques to deal with optimization problems arising in data science.
Content | This course provides an in-depth theoretical treatment of classical and modern optimization methods that are relevant in data science. After a general discussion of the role that optimization plays in the process of learning from data, we give an introduction to the theory of (convex) optimization. Based on this, we present and analyze algorithms in the following four categories: first-order methods (gradient and coordinate descent, Frank-Wolfe, subgradient and mirror descent, stochastic and incremental gradient methods; a minimal gradient-descent sketch follows this table); second-order methods (Newton and quasi-Newton methods); non-convexity (local convergence, provable global convergence, cone programming, convex relaxations); min-max optimization (extragradient methods). The emphasis is on the motivations and design principles behind the algorithms, on provable performance bounds, and on the mathematical tools and techniques used to prove them. The goal is to equip students with a fundamental understanding of why optimization algorithms work and what their limits are. This understanding will help in selecting a suitable algorithm for a given application, but providing concrete practical guidance is not our focus.
Prerequisites / Notice | A solid background in analysis and linear algebra; some background in theoretical computer science (computational complexity, analysis of algorithms); the ability to understand and write mathematical proofs.
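To make the first-order methods listed under the course content concrete, here is a minimal sketch of plain gradient descent; it is not course material, and the convex quadratic objective f(x) = 0.5 x^T A x - b^T x, the 1/L step size, and all names are assumptions chosen purely for illustration. For an L-smooth convex f, this iteration satisfies the classic bound f(x_k) - f(x*) <= L ||x_0 - x*||^2 / (2k), the kind of provable performance guarantee the course analyzes.

```python
import numpy as np

# Minimal gradient descent on a convex quadratic f(x) = 0.5 x^T A x - b^T x.
# The gradient is A x - b; for symmetric positive definite A, f is L-smooth
# with L equal to the largest eigenvalue of A, so the classic step size is 1/L.

def gradient_descent(A, b, x0, num_steps=200):
    L = np.linalg.eigvalsh(A).max()  # smoothness constant of the quadratic
    x = x0.astype(float)
    for _ in range(num_steps):
        grad = A @ x - b             # gradient of f at the current iterate
        x = x - grad / L             # one step with the standard 1/L step size
    return x

# Usage: the minimizer of f solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = gradient_descent(A, b, x0=np.zeros(2))
print(x, np.linalg.solve(A, b))      # the two vectors should nearly coincide
```

The step size 1/L is only one of several valid choices; analyses of the kind covered in the course explain how the convergence rate depends on quantities such as L and, under strong convexity, on the condition number.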