401-3619-69L Mathematics Tools in Machine Learning
Semester | Autumn Semester 2019
Lecturers | F. Balabdaoui
Periodicity | non-recurring course
Language of instruction | English
Courses
Number | Title | Hours | Lecturers
---|---|---|---
401-3619-69 G | Mathematics Tools in Machine Learning | 2 hrs | F. Balabdaoui
Catalogue data
Abstract | The course reviews many essential mathematical tools used in statistical learning. The lectures will cover the notions of hypothesis classes, sample complexity, PAC learnability, model validation and selection, as well as results on several well-known algorithms and their convergence.
Objective | In the exploding world of artificial intelligence and automated learning, there is an urgent need to go back to the basics of what is driving many of the well-established methods in statistical learning. The students attending the lectures will get acquainted with the main theoretical results needed to establish the theory of statistical learning. We start by defining what is meant by learning a task and a training sample, and by discussing the trade-off between choosing a big class of functions (hypotheses) to learn the task and the difficulty of estimating the unknown function generating the observed sample (see the decomposition below). The course will also cover the notion of learnability and the conditions under which it is possible to learn a task. In a second part, the lectures will cover algorithmic aspects: some well-known algorithms will be described and their convergence proved. Through the exercise classes, the students will deepen their understanding by applying the learned theory to new situations, examples and counterexamples.
Content | The course will cover the following subjects: (*) Definition of Learning and Formal Learning Models (*) Uniform Convergence (*) Linear Predictors (*) The Bias-Complexity Trade-off (*) VC-classes and the VC dimension (*) Model Selection and Validation (*) Convex Learning Problems (*) Regularization and Stability (*) Stochastic Gradient Descent (see the sketch below) (*) Support Vector Machines (*) Kernels
Literature | The course will be based on the book "Understanding Machine Learning: From Theory to Algorithms" by S. Shalev-Shwartz and S. Ben-David, which is available online through the ETH electronic library. Other good sources include: (*) the book "Neural Network Learning: Theoretical Foundations" by Martin Anthony and Peter L. Bartlett, which can be borrowed from the ETH library; (*) the lecture notes on "Mathematics of Machine Learning" taught by Philippe Rigollet, available through the OpenCourseWare website of MIT.
Prerequisites / Notice | Being able to follow the lectures requires a solid background in Probability Theory and Mathematical Statistics. Notions of computation and of convergence of algorithms can be helpful but are not required.
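As an illustration of the bias-complexity trade-off mentioned in the objective, here is the standard error decomposition, written in the notation of the Shalev-Shwartz and Ben-David book listed under Literature (this sketch is not part of the official course description): for a hypothesis class H and the hypothesis h_S returned by empirical risk minimisation on a sample S, the true risk splits as

```latex
% Bias-complexity decomposition (notation of Shalev-Shwartz & Ben-David, Ch. 5):
% \mathcal{D} is the unknown data distribution, \mathcal{H} the hypothesis class,
% h_S the output of empirical risk minimisation on the sample S.
L_{\mathcal{D}}(h_S)
  = \underbrace{\min_{h \in \mathcal{H}} L_{\mathcal{D}}(h)}_{\varepsilon_{\mathrm{app}}}
  + \underbrace{\Big( L_{\mathcal{D}}(h_S) - \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) \Big)}_{\varepsilon_{\mathrm{est}}}
```

Enlarging the class H drives the approximation error down, but for a fixed sample size it tends to drive the estimation error up; the course formalises this tension via uniform convergence and the VC dimension.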
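For the stochastic gradient descent topic in the content list, a minimal runnable sketch on a convex least-squares problem may help fix ideas. The synthetic data, the 1/sqrt(t) step size and all names below are illustrative assumptions, not course material:

```python
import numpy as np

# Minimal SGD sketch on a convex least-squares problem.
# Everything here (data, step size, iteration count) is illustrative.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))          # n examples in d dimensions
w_true = rng.normal(size=d)          # unknown function generating the sample
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)                      # initial iterate
for t in range(1, 2001):
    i = rng.integers(n)              # pick one example uniformly at random
    grad = (X[i] @ w - y[i]) * X[i]  # stochastic gradient of 0.5 * (x_i . w - y_i)^2
    w -= grad / np.sqrt(t)           # decaying step size, eta_t = 1 / sqrt(t)

print("distance to generating vector:", np.linalg.norm(w - w_true))
```

For convex Lipschitz losses, step sizes of this order give the O(1/sqrt(T)) convergence guarantee proved in the SGD chapter of the course textbook.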
Performance assessment
Performance assessment information (valid until the course unit is held again)
ECTS credits | 4 credits
Examiners | F. Balabdaoui
Type | session examination
Language of examination | English
Repetition | The performance assessment is offered every session. Repetition possible without re-enrolling for the course unit.
Mode of examination | written, 120 minutes
Additional information on mode of examination | The exam is offered only in the examination sessions Winter 2020 and Summer 2020.
Written aids | None
This information can be updated until the beginning of the semester; information on the examination timetable is binding.
Learning materials
No public learning materials available.
Only public learning materials are listed.
Groups
No information on groups available.
Restrictions
There are no additional restrictions for the registration. |
Offered in
Programme | Section | Type
---|---|---
Doctoral Department of Mathematics | Graduate School | W
Mathematics Master | Selection: Probability Theory, Statistics | W
Statistics Master | Statistical and Mathematical Courses | W