David Steurer: Catalogue data in Spring Semester 2019 |
Name | Prof. Dr. David Steurer |
Field | Theoretical Computer Science |
Address | Professur Theoretische Informatik, ETH Zürich, OAT Z 22.2, Andreasstrasse 5, 8092 Zürich, SWITZERLAND |
E-Mail | david.steurer@inf.ethz.ch |
URL | https://www.dsteurer.org |
Department | Computer Science |
Relationship | Associate Professor |
Number | Title | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|
252-4202-00L | Seminar in Theoretical Computer Science. The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date but do not attend the seminar will officially fail it. | 2 credits | 2S | A. Steger, B. Gärtner, M. Ghaffari, M. Hoffmann, J. Lengler, D. Steurer, B. Sudakov | |
Abstract | Presentation of recent publications in theoretical computer science, including results by diploma, masters and doctoral candidates. | ||||
Learning objective | To get an overview of current research in the areas covered by the involved research groups. To present results from the literature. | ||||
Prerequisites / Notice | This seminar takes place as part of the joint research seminar of several theory groups. It is intended only for students with excellent performance. The formal minimal requirement is having passed one of the following courses: Algorithms, Probability, and Computing; Randomized Algorithms and Probabilistic Methods; Geometry: Combinatorics and Algorithms; or Advanced Algorithms. (If you cannot fulfill this requirement because this is your first term at ETH, but you believe that you satisfy equivalent criteria, please send an email with a detailed description of your reasoning to the organizers of the seminar.) | ||||
261-5110-00L | Optimization for Data Science | 8 credits | 3V + 2U + 2A | B. Gärtner, D. Steurer | |
Abstract | This course gives an overview of modern optimization methods, with a particular focus on applications in machine learning and data science. | ||||
Learning objective | Understanding the theoretical and practical aspects of relevant optimization methods used in data science. Learning general paradigms to deal with optimization problems arising in data science. | ||||
Content | This course gives an overview of modern optimization methods, with a particular focus on applications in machine learning and data science. In the first part of the course, we discuss how classical first- and second-order methods such as gradient descent and Newton's method can be adapted to scale to large datasets, in theory and in practice (an illustrative sketch follows this course entry). We also cover some new algorithms and paradigms that have been developed specifically in the context of data science. The emphasis is not so much on applying these methods (many of which are covered in other courses) but on understanding and analyzing the methods themselves. In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms for computational problems arising in data science. We develop a unified perspective on this paradigm through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we discuss non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation. | ||||
Prerequisites / Notice | As background, we require material taught in the course "252-0209-00L Algorithms, Probability, and Computing". It is not necessary that participants have actually taken the course, but they should be prepared to catch up if necessary. | ||||
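Below is a minimal, illustrative sketch (not course material) of the first idea mentioned in the course content above: adapting gradient descent to large datasets by estimating the gradient from small random mini-batches rather than the full dataset. All function names, the least-squares objective, and the parameter choices here are assumptions made for this example.

```python
# Illustrative sketch: mini-batch gradient descent for least-squares regression.
# Objective: minimize (1/2n) * ||X w - y||^2; each step uses a random mini-batch
# to approximate the full gradient, so the per-step cost is independent of n.
import numpy as np


def minibatch_gradient_descent(X, y, step_size=0.01, batch_size=32, n_steps=1000, seed=0):
    """Return an approximate least-squares solution via mini-batch gradient descent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_steps):
        idx = rng.choice(n, size=min(batch_size, n), replace=False)  # sample a mini-batch
        Xb, yb = X[idx], y[idx]
        grad = Xb.T @ (Xb @ w - yb) / len(idx)  # gradient of the batch loss
        w -= step_size * grad                   # first-order update step
    return w


if __name__ == "__main__":
    # Synthetic example: recover a planted weight vector from noisy linear measurements.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 5))
    w_true = rng.normal(size=5)
    y = X @ w_true + 0.01 * rng.normal(size=1000)
    w_hat = minibatch_gradient_descent(X, y)
    print("estimation error:", np.linalg.norm(w_hat - w_true))
```

The batch size trades off the cost per step against the variance of the gradient estimate; setting it equal to the number of samples recovers classical full-batch gradient descent.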
263-4110-00L | Interdisciplinary Algorithms Lab. Does not take place this semester. In the Master Programme, at most 10 credits can be obtained from Labs on top of the Interfocus Courses. Additional Labs will be listed in the Addendum. | 5 credits | 2P | A. Steger, D. Steurer | |
Abstract | In this course students will develop solutions for algorithmic problems posed by researchers from other fields. | ||||
Learning objective | Students will learn that tackling algorithmic problems from an interdisciplinary or applied context requires combining a solid understanding of algorithmic methodology with insight into the problem at hand, so as to judge which side constraints are essential and which can be loosened. | ||||
Prerequisites / Notice | Students will work in teams. Ideally, the skills of team members complement each other. Interested Bachelor students can apply for participation by sending an email to steger@inf.ethz.ch explaining their motivation and including their transcripts. |