Search result: Catalogue data in Spring Semester 2020
Computer Science Master
Focus Courses
Focus in Computational Science
Core Courses of the Focus in Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
401-3632-00L | Computational Statistics | W | 8 ECTS | 3V + 1U | M. H. Maathuis
Abstract | We discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on and methods are applied using the statistical programming language R.
Objective | The student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R.
Prerequisites / Notice | At least one semester of (basic) probability and statistics. Programming experience is helpful but not required.
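The methods in 401-3632-00L are applied in R; purely as an illustrative analogy, the sketch below shows in Python (assuming scikit-learn and a synthetic dataset, neither of which is part of the course material) the kind of cross-validated prediction-error estimate the abstract alludes to.

```python
# Illustrative sketch only: the course works in R; this mirrors the idea of
# estimating prediction error by cross-validation using Python/scikit-learn.
# The data set is synthetic and all parameter choices are arbitrary.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]               # only a few informative predictors
y = X @ beta + rng.normal(size=n)

model = Ridge(alpha=1.0)                  # regularised linear regression
# 5-fold cross-validation; scikit-learn reports the negative mean squared error
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
print("estimated test MSE per fold:", -scores)
print("mean estimated test MSE:", -scores.mean())
```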
Elective Courses of the Focus in Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
252-0526-00L | Statistical Learning Theory | W | 7 ECTS | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
Abstract | The course covers advanced methods of statistical learning: variational methods and optimization; deterministic annealing; clustering for diverse types of data; and model validation by information theory.
Objective | The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content | - Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing. - Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures. - Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information-theoretic approach for model validation. - Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
Lecture notes | A draft of the lecture notes will be provided. Lecture slides will be made available.
Literature | T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning. Springer, 2001. L. Devroye, L. Györfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice | Knowledge of machine learning ("Introduction to Machine Learning" and/or "Advanced Machine Learning") and basic knowledge of statistics.
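The Content of 252-0526-00L above lists deterministic annealing and clustering; the following minimal Python sketch (synthetic 2-D data, an invented cooling schedule, and not taken from the lecture material) shows a maximum-entropy "soft" clustering in which cluster assignments are Gibbs distributions whose temperature is gradually lowered.

```python
# Minimal sketch of deterministic annealing for clustering, for illustration only:
# soft assignments follow a Gibbs distribution at temperature T, which is lowered
# step by step. Data, schedule and parameters are invented for this example.
import numpy as np

def deterministic_annealing(X, k=3, T0=5.0, T_min=0.05, cooling=0.8, inner_iters=30):
    n = len(X)
    rng = np.random.default_rng(0)
    centers = X[rng.choice(n, size=k, replace=False)]       # initial prototypes
    T = T0
    while T > T_min:
        for _ in range(inner_iters):
            # squared distances of every point to every prototype, shape (n, k)
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            # maximum-entropy (Gibbs) assignments at temperature T
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)      # numerical stability
            p = np.exp(logits)
            p /= p.sum(axis=1, keepdims=True)
            # re-estimate prototypes as responsibility-weighted means
            centers = (p.T @ X) / p.sum(axis=0)[:, None]
        T *= cooling                                         # lower the temperature
    return centers, p

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc=m, scale=0.5, size=(50, 2)) for m in (-3.0, 0.0, 3.0)])
    centers, assignments = deterministic_annealing(X)
    print("prototypes:\n", np.round(centers, 2))
```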
261-5120-00L | Machine Learning for Health Care. Number of participants limited to 150. | W | 5 ECTS | 3P + 1A | G. Rätsch, J. Vogt, V. Boeva
Abstract | The course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical solutions.
Objective | In recent years, we have observed rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, the increase in data availability and a reduction in computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types enables us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present and their current technical solutions.
Content | The course will consist of four topic clusters that cover the most relevant applications of ML in Biomedicine: 1) Structured time series: time series of structured data often appear in biomedical datasets and present challenges such as variables with different periodicities, conditioning on static data, etc. 2) Medical notes: vast amounts of medical observations are stored in the form of free text; we will analyze strategies for extracting knowledge from them. 3) Medical images: images are a fundamental piece of information in many medical disciplines; we will study how to train ML algorithms with them. 4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets found in biomedicine, many relevant ML applications are expected to arise in the near future; we will review and discuss current applications and challenges.
Prerequisites / Notice | Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line. Relation to course 261-5100-00 Computational Biomedicine: this course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
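Topic cluster 1 of 261-5120-00L concerns structured time series with different periodicities, conditioned on static data. As a hedged illustration only (all column names, timestamps and values below are invented), the following pandas sketch aligns two clinical signals onto a common hourly grid and attaches static patient attributes.

```python
# Illustrative sketch only: aligning irregularly sampled clinical time series
# onto a common hourly grid and attaching static patient data.
# All column names, timestamps and values are invented for this example.
import pandas as pd

heart_rate = pd.DataFrame({
    "time": pd.to_datetime(["2020-01-01 00:05", "2020-01-01 00:20", "2020-01-01 01:10"]),
    "heart_rate": [72, 75, 80],
}).set_index("time")

lab_values = pd.DataFrame({
    "time": pd.to_datetime(["2020-01-01 00:30", "2020-01-01 06:00"]),
    "lactate": [1.1, 2.4],
}).set_index("time")

# Average the fast vital sign within each hour; carry the slow lab value forward.
hourly = pd.concat(
    [
        heart_rate.resample("1h").mean(),
        lab_values.resample("1h").last().ffill(),
    ],
    axis=1,
)

# Condition on static data by broadcasting it onto every time step.
static = {"age": 64, "sex": "F"}
for key, value in static.items():
    hourly[key] = value

print(hourly)
```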
263-5300-00L | Guarantees for Machine Learning | W | 5 ECTS | 2V + 2A | F. Yang
Abstract | This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.
Objective | This course is aimed at advanced Master's and doctoral students who want to understand and/or conduct independent research on the theory of modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions and learning how to effectively present research ideas to a professional audience.
Content | This course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in: concentration bounds, uniform convergence; high-dimensional statistics (e.g. Lasso); prediction error bounds for non-parametric statistics (e.g. in kernel spaces); minimax lower bounds; regularization via optimization. The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to: how overparameterization could help generalization (interpolating models, linearized NN); how overparameterization could help optimization (non-convex optimization, loss landscape); complexity measures and approximation-theoretic properties of randomly initialized and trained NN; generalization of robust learning (adversarial robustness, standard and robust error tradeoff); prediction with calibrated confidence (conformal prediction, calibration).
Prerequisites / Notice | It is absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning as taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken a course on optimization or approximation theory. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.
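The Content of 263-5300-00L above starts with concentration bounds and uniform convergence. As a pointer to the flavour of such guarantees (standard textbook statements, not material taken from the lecture; the notation is chosen here), Hoeffding's inequality and the finite-class uniform convergence bound it yields read:

```latex
% Standard results, quoted for illustration; notation chosen here, not by the course.
% Hoeffding's inequality: for independent X_1, ..., X_n with values in [0,1] and mean mu,
\[
  \Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| \ge t \right)
  \le 2 \exp\!\left( -2 n t^2 \right), \qquad t > 0 .
\]
% Applied to the [0,1]-bounded losses of each hypothesis in a finite class H and
% combined with a union bound over H, it gives uniform convergence of the empirical
% risk \hat{R}_n (computed on n i.i.d. samples) to the true risk R:
\[
  \Pr\!\left( \sup_{h \in \mathcal{H}} \bigl| \hat{R}_n(h) - R(h) \bigr| \ge t \right)
  \le 2 \, |\mathcal{H}| \, \exp\!\left( -2 n t^2 \right).
\]
```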
Seminar in Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
252-5704-00L | Advanced Methods in Computer Graphics. Number of participants limited to 24. The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date but do not attend the seminar will officially fail the seminar. | W | 2 ECTS | 2S | O. Sorkine Hornung
Abstract | This seminar covers advanced topics in computer graphics with a focus on the latest research results. Topics include modeling, rendering, visualization, animation, physical simulation, computational photography, and others.
Objective | The goal is to obtain an in-depth understanding of current problems and research topics in the field of computer graphics, as well as to improve presentation and critical analysis skills.
261-5113-00L | Computational Challenges in Medical Genomics. Number of participants limited to 20. | W | 2 ECTS | 2S | A. Kahles, G. Rätsch
Abstract | This seminar discusses recent relevant contributions to the fields of computational genomics, algorithmic bioinformatics, statistical genetics and related areas. Each participant will give a presentation and lead the subsequent discussion.
Objective | Preparing and giving a scientific presentation in front of peers is a central part of working in the scientific domain. In this seminar, the participants will learn how to efficiently summarize the relevant parts of a scientific publication, critically reflect on its contents, and present them to an audience. The skills needed to successfully present the key points of existing research are the same as those needed to communicate one's own research ideas. In addition to giving a presentation, each student will both contribute to and lead a discussion section on the topics presented in the class.
Content | The topics covered in the seminar relate to recent computational challenges arising in genomics and biomedicine, including but not limited to genomic variant interpretation, genomic sequence analysis, compressive genomics tasks, single-cell approaches, privacy considerations, statistical frameworks, etc. The selected papers include both recently published works contributing novel ideas to the areas mentioned above and seminal contributions from the past.
Prerequisites / Notice | Knowledge of algorithms and data structures and an interest in applications in genomics and computational biomedicine.