David Steurer: Catalogue data in Spring Semester 2019
|Name||Prof. Dr. David Steurer|
Professorship of Theoretical Computer Science
ETH Zürich, OAT Z 22.2
|252-4202-00L||Seminar in Theoretical Computer Science|
The deadline for deregistration is the end of the second week of the semester. Students who are still registered after that date but do not attend the seminar will officially fail it.
|2 credits||2S||A. Steger, B. Gärtner, M. Ghaffari, M. Hoffmann, J. Lengler, D. Steurer, B. Sudakov|
|Abstract||Presentation of recent publications in theoretical computer science, including results by diploma, Master's, and doctoral candidates.|
|Objective||To get an overview of current research in the areas covered by the involved research groups. To present results from the literature.|
|Prerequisites / Notice||This seminar takes place as part of the joint research seminar of several theory groups. Participation is intended only for students with excellent academic performance. The formal minimum requirement is having passed one of the courses Algorithms, Probability, and Computing; Randomized Algorithms and Probabilistic Methods; Geometry: Combinatorics and Algorithms; or Advanced Algorithms. (If you cannot fulfil this requirement because this is your first term at ETH, but you believe that you satisfy equivalent criteria, please send an email with a detailed explanation of your reasoning to the organizers of the seminar.)|
|261-5110-00L||Optimization for Data Science||8 credits||3V + 2U + 2A||B. Gärtner, D. Steurer|
|Abstract||This course provides an overview of modern optimization methods, with applications in machine learning and data science in particular.|
|Objective||Understanding the theoretical and practical aspects of relevant optimization methods used in data science. Learning general paradigms for dealing with optimization problems arising in data science.|
|Content||This course provides an overview of modern optimization methods, with applications in machine learning and data science in particular.|
In the first part of the course, we discuss how classical first- and second-order methods, such as gradient descent and Newton's method, can be adapted to scale to large datasets, in theory and in practice. We also cover some newer algorithms and paradigms that have been developed specifically in the context of data science. The emphasis is not so much on applying these methods (many of which are covered in other courses) as on understanding and analyzing the methods themselves.
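A minimal, self-contained sketch (illustrative only, not course material) of the scaling issue mentioned above: full-batch gradient descent pays for the whole dataset at every step, while stochastic gradient descent uses one random sample per step. The toy one-dimensional least-squares problem, the step sizes, and the iteration counts are arbitrary choices for illustration.

```python
import random

# Toy 1-D least-squares problem: find w minimizing
# f(w) = (1/2n) * sum_i (w*x_i - y_i)^2, with true slope 3.0 plus noise.
random.seed(0)
n = 1000
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [3.0 * x + random.gauss(0, 0.1) for x in xs]

def full_gradient(w):
    # Gradient of f at w, computed over the entire dataset: O(n) per call.
    return sum((w * x - y) * x for x, y in zip(xs, ys)) / n

# Full-batch gradient descent: each of the 100 steps touches all n points.
w_gd = 0.0
for _ in range(100):
    w_gd -= 0.5 * full_gradient(w_gd)

# Stochastic gradient descent: each step uses a single random sample, O(1),
# with a diminishing step size 1/t to damp the sampling noise.
w_sgd = 0.0
for t in range(1, 5001):
    i = random.randrange(n)
    grad_i = (w_sgd * xs[i] - ys[i]) * xs[i]
    w_sgd -= (1.0 / t) * grad_i

print(w_gd, w_sgd)  # both should land near the true slope 3.0
```

Full-batch descent converges to the least-squares solution to high accuracy here; SGD gets close at a fraction of the per-step cost, which is the trade-off that makes it attractive for large datasets.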
In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms for computational problems arising in data science. We develop a unified perspective on this paradigm through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we discuss non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation.
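To give a flavor of the sum-of-squares paradigm mentioned above (standard background, not a statement about the course syllabus): to certify that a polynomial $p$ is nonnegative, one searches for a representation of $p$ as a sum of squares,

```latex
p(x) \;=\; \sum_i q_i(x)^2 .
```

For a fixed degree bound $\deg q_i \le d$, such a representation exists if and only if $p(x) = v(x)^\top Q\, v(x)$ for some positive semidefinite matrix $Q$, where $v(x)$ is the vector of all monomials of degree at most $d$. Finding $Q$ is therefore a semidefinite feasibility problem, and letting $d$ grow yields the sum-of-squares hierarchy.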
|Prerequisites / Notice||As background, we require the material taught in the course "252-0209-00L Algorithms, Probability, and Computing". Participants need not have actually taken that course, but they should be prepared to catch up if necessary.|
|263-4110-00L||Interdisciplinary Algorithms Lab|
Does not take place this semester.
In the Master's programme, a maximum of 10 credits can be earned with labs in addition to the interfocus courses. Additional labs are listed on the addendum.
|5 credits||2P||A. Steger, D. Steurer|
|Abstract||In this course, students develop solutions for algorithmic problems posed by researchers from other fields.|
|Objective||Students learn that tackling algorithmic problems from an interdisciplinary or applied context requires combining a solid understanding of algorithmic methodology with insight into the problem at hand, in order to judge which side constraints are essential and which can be loosened.|
|Prerequisites / Notice||Students will work in teams. Ideally, the skills of team members complement each other.|
Interested Bachelor's students can apply for participation by sending an email to firstname.lastname@example.org, explaining their motivation and including their transcripts.