Search result: Catalog data for Spring Semester 2020

Computer Science Master
Focus Courses
Focus in Information Systems
Elective Courses of the Focus in Information Systems
Number | Title | Type | ECTS | Hours | Lecturers
252-0312-00L | Ubiquitous Computing | W | 4 | 2V + 1A | C. Holz, F. Mattern, S. Mayer
Short description: Unlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Learning objective: See the short description above.
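To give "sensing & tracking" and "health monitoring" a concrete face, here is a minimal illustrative sketch, not course material: counting steps by thresholding the magnitude of 3-axis accelerometer samples. The function name, the threshold value, and the toy trace are all made up for the example.

```python
import math

def step_count(samples, threshold=11.0):
    """Count steps in a stream of 3-axis accelerometer samples (m/s^2).

    A step is registered on each upward crossing of the magnitude
    threshold; the ~9.81 m/s^2 gravity baseline stays below it.
    The threshold is a hypothetical value chosen for illustration.
    """
    steps = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and not above:
            steps += 1  # rising edge -> one step
        above = magnitude > threshold
    return steps

# Toy trace: resting (gravity only) with two step-like spikes.
trace = ([(0.0, 0.0, 9.8)] * 5 + [(0.0, 3.0, 12.0)]
         + [(0.0, 0.0, 9.8)] * 5 + [(0.0, 3.0, 12.5)])
print(step_count(trace))  # -> 2
```

Real pedometers add filtering and adaptive thresholds, but the rising-edge idea is the same.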
Lecture notes: Copies of the slides will be made available.
Literature: Will be provided in the lecture. To put you in the mood:
Mark Weiser: The Computer for the 21st Century. Scientific American, September 1991, pp. 94-104
252-0526-00L | Statistical Learning Theory | W | 7 | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
Short description: The course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Learning objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content:
- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, the information bottleneck, and deterministic annealing (a derivation sketch follows this list).

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
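To show how the first and last items above connect, here is a compact sketch of a standard textbook derivation (our summary for orientation, not the official lecture notes): maximizing entropy subject to an expected-cost constraint yields a Gibbs distribution, whose free-energy form is the starting point of deterministic annealing.

```latex
% Maximum entropy under an expected-cost constraint:
\[
\max_{p}\; H(p) = -\sum_{x} p(x)\,\log p(x)
\quad \text{s.t.} \quad
\sum_{x} p(x)\,R(x) = \mu, \qquad \sum_{x} p(x) = 1 .
\]
% Lagrangian stationarity yields the Gibbs distribution
\[
p^{*}(x) \;=\; \frac{e^{-\beta R(x)}}{\sum_{x'} e^{-\beta R(x')}} ,
\]
% which equivalently minimizes the free energy
\[
F(p) \;=\; \mathbb{E}_{p}[R] \;-\; \tfrac{1}{\beta}\, H(p) .
\]
% Deterministic annealing tracks p^{*} while the inverse temperature
% \beta is increased step by step; as \beta \to \infty, p^{*}
% concentrates on the minimizers of the cost R.
```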
Lecture notes: A draft of the lecture notes will be provided. Lecture slides will be made available.
Literature: Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. Springer, 2001.

L. Devroye, L. Györfi, and G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Knowledge of machine learning ("Introduction to Machine Learning" and/or "Advanced Machine Learning") and basic knowledge of statistics.
252-3005-00L | Natural Language Understanding | W | 5 | 2V + 1U + 1A | Not yet known
Does not take place this semester; will be offered again in HS20 (Autumn Semester 2020).
Short description: This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Learning objective: The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.
Content: This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches; see the short description above for the areas covered.
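As a small taste of the statistical approaches the course focuses on, here is a minimal illustrative sketch, not course material: a bigram language model with add-one (Laplace) smoothing. The two-sentence corpus and all names are invented for the example.

```python
from collections import Counter

# Toy corpus; <s> and </s> mark sentence boundaries.
corpus = [["<s>", "the", "dog", "barks", "</s>"],
          ["<s>", "the", "cat", "meows", "</s>"]]

unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter((sent[i], sent[i + 1])
                  for sent in corpus for i in range(len(sent) - 1))
vocab_size = len(unigrams)

def bigram_prob(prev, word):
    """P(word | prev) with add-one (Laplace) smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

print(bigram_prob("the", "dog"))    # seen bigram:   (1+1)/(2+7)
print(bigram_prob("the", "meows"))  # unseen bigram: (0+1)/(2+7)
```

Smoothing keeps unseen bigrams such as ("the", "meows") from receiving probability zero, which is the classic motivation for the technique.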
Literature: Lectures will make use of textbooks such as the one by Jurafsky and Martin where appropriate, but will also draw on original research and survey papers.
263-5300-00L | Guarantees for Machine Learning | W | 5 | 2V + 2A | F. Yang
Enrollment restricted.
Short description: This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.
Learning objective: This course is aimed at advanced Master's and doctoral students who want to understand and/or conduct independent research on theory for modern machine learning. To this end, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply this knowledge and go through the process of critically questioning recently published work, finding relevant research questions, and learning how to effectively present research ideas to a professional audience.
Content: This course teaches classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in

- concentration bounds, uniform convergence (a typical bound of this form is sketched after this list)
- high-dimensional statistics (e.g. Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization
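For orientation, the first item on this list refers to statements of the following standard form; Hoeffding's inequality and the finite-class uniform-convergence bound it implies are quoted here as an illustration, not as course notes.

```latex
% Hoeffding's inequality: for i.i.d. Z_1, ..., Z_n with values in [0,1]
% and mean m,
\[
\Pr\Big( \Big| \tfrac{1}{n} \textstyle\sum_{i=1}^{n} Z_i - m \Big| \ge t \Big)
\;\le\; 2\, e^{-2 n t^{2}} .
\]
% A union bound over a finite hypothesis class \mathcal{H} gives uniform
% convergence: with probability at least 1 - \delta, for all h \in \mathcal{H},
\[
\big| \widehat{R}_{n}(h) - R(h) \big|
\;\le\; \sqrt{ \frac{\log\big( 2\,|\mathcal{H}| / \delta \big)}{2n} } ,
\]
% where \widehat{R}_n(h) is the empirical risk of h and R(h) its
% population risk, for a loss bounded in [0,1].
```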

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to

- how overparameterization could help generalization (interpolating models, linearized NNs)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NNs
- generalization of robust learning (adversarial robustness, standard vs. robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration; see the sketch after this list)
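To make the last item concrete, here is a minimal sketch of split conformal prediction for regression, a standard construction; the data, the stand-in model, and all names are invented for the example, so treat this as an illustration rather than course material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise. The "model" below is a stand-in for a
# predictor fitted on a separate training split.
x = rng.uniform(0.0, 1.0, 200)
y = 2.0 * x + rng.normal(0.0, 0.1, 200)

def predict(x):
    return 2.0 * x  # hypothetical fitted model

# Split conformal: absolute residuals on a held-out calibration set.
x_cal, y_cal = x[:100], y[:100]
scores = np.abs(y_cal - predict(x_cal))

alpha = 0.1                                  # target miscoverage
n = len(scores)
level = np.ceil((n + 1) * (1 - alpha)) / n   # finite-sample correction
q = np.quantile(scores, level, method="higher")

# The interval [f(x) - q, f(x) + q] covers a fresh test point with
# probability >= 1 - alpha (marginally, under exchangeability).
x_test = 0.5
print(predict(x_test) - q, predict(x_test) + q)
```

Notably, the coverage guarantee holds regardless of how well the underlying model fits; a poor model only makes the intervals wider.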
Prerequisites / Notice: It is absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have attended a course on optimization or approximation theory. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.