Name | Prof. Dr. Andreas Krause |

Field | Computer Science |

Address | Institut für Maschinelles Lernen, ETH Zürich, OAT Y 13.1, Andreasstrasse 5, 8092 Zürich, Switzerland |

Phone | +41 44 632 63 22 |

Fax | +41 44 623 15 62 |

E-Mail | krausea@ethz.ch |

URL | http://las.ethz.ch/krausea.html |

Department | Computer Science |

Relationship | Full Professor |

Number | Title | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|

252-0220-00L | Introduction to Machine Learning (previously called "Learning and Intelligent Systems"). Prof. Krause approves that students take distance exams, including exams that take place at a later time due to a different time zone at the alternative exam location. To obtain Prof. Krause's signature on the distance exam form, please send it to Rita Klute, rita.klute@inf.ethz.ch. | 8 ECTS | 4V + 2U + 1A | A. Krause | |

Short description | The course introduces the foundations of learning and making predictions based on data. | ||||

Learning objective | The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice and provide hands-on experience in a course project. | ||||

Content | - Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent) - Linear classification: logistic regression (feature selection, sparsity, multi-class) - Kernels and the kernel trick (properties of kernels; applications to linear and logistic regression; k-NN) - The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference) - Statistical decision theory (decision making based on statistical models and utility functions) - Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions) - Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE) - Bayesian networks and exact inference (conditional independence; variable elimination; TANs) - Approximate inference (sum/max product; Gibbs sampling) - Latent variable models (Gaussian Mixture Models, EM algorithm) - Temporal models (Bayesian filtering, Hidden Markov Models) - Sequential decision making (MDPs, value and policy iteration) - Reinforcement learning (model-based RL, Q-learning) | ||||
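To give a flavor of the first topic in the list, here is a minimal illustrative sketch (not official course material) of linear regression with L2 regularization fit by batch gradient descent; the function name `fit_ridge_gd` and the parameters `lam` and `lr` are made up for this example.

```python
import numpy as np

def fit_ridge_gd(X, y, lam=0.1, lr=0.01, steps=2000):
    """Minimize (1/n)||Xw - y||^2 + lam * ||w||^2 by batch gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Gradient of the regularized least-squares objective.
        grad = 2.0 * X.T @ (X @ w - y) / n + 2.0 * lam * w
        w -= lr * grad
    return w

# Tiny usage example on synthetic noise-free data y = 3x.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 3.0 * X[:, 0]
w = fit_ridge_gd(X, y, lam=0.0)  # lam=0 recovers plain least squares
print(w)  # close to [3.]
```

With `lam > 0` the recovered weight shrinks toward zero, which is the trade-off between goodness of fit and model complexity mentioned in the learning objective.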

Literature | Textbook: Kevin Murphy, "Machine Learning: A Probabilistic Perspective", MIT Press | ||||

Prerequisites / Notice | The course is designed to provide the basis for the following courses: - Advanced Machine Learning - Data Mining: Learning from Large Data Sets - Probabilistic Artificial Intelligence - Probabilistic Graphical Models - Seminar "Advanced Topics in Machine Learning" | ||||

252-0945-06L | Doctoral Seminar Machine Learning (FS18). Only for Computer Science Ph.D. students. This doctoral seminar is intended for Ph.D. students affiliated with the Institute for Machine Learning. Other Ph.D. students who work on machine learning projects or related topics need approval by at least one of the organizers to register for the seminar. | 2 ECTS | 2S | J. M. Buhmann, T. Hofmann, A. Krause, G. Rätsch | |

Short description | An essential aspect of any research project is dissemination of the findings arising from the study. Here we focus on oral communication, which includes: appropriate selection of material, preparation of visual aids (slides and/or posters), and presentation skills. | ||||

Learning objective | The seminar participants should learn how to prepare and deliver scientific talks as well as how to deal with technical questions. Participants are also expected to contribute actively to discussions during presentations by others, thus learning and practicing critical-thinking skills. | ||||

Prerequisites / Notice | This doctoral seminar is intended for Ph.D. students affiliated with the Institute for Machine Learning. Other Ph.D. students who work on machine learning projects or related topics need approval by at least one of the organizers to register for the seminar. |