Search result: Catalogue data in Spring Semester 2020
Cyber Security Master
Minor: Computational Science

Core Courses
| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 401-3632-00L | Computational Statistics | W | 8 credits | 3V + 1U | M. H. Maathuis |
Abstract: We discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on, and methods are applied using the statistical programming language R.

Learning objective: The student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R.

Content: See the class website.

Prerequisites / Notice: At least one semester of (basic) probability and statistics. Programming experience is helpful but not required.
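The entry above is deliberately brief ("see the class website"), so a concrete illustration may help. The course itself works in R; purely as a hedged sketch of one typical method covered (cross-validated estimation of prediction error), here is a minimal example in Python with made-up data. The model, fold count, and data are arbitrary assumptions, not course material.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Toy data: y depends linearly on the first feature plus noise (assumption).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = 2.0 * X[:, 0] + rng.standard_normal(200)

# 5-fold cross-validated mean squared prediction error.
mse = -cross_val_score(LinearRegression(), X, y,
                       scoring="neg_mean_squared_error", cv=5).mean()
```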
Electives

| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 252-0526-00L | Statistical Learning Theory | W | 7 credits | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez |
Abstract: The course covers advanced methods of statistical learning:
- Variational methods and optimization
- Deterministic annealing
- Clustering for diverse types of data
- Model validation by information theory
Learning objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content:
- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing.
- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.
- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information-theoretic approach to model validation.
- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free-energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
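As a hedged illustration of the deterministic annealing concept named above (an assumed sketch, not taken from the course or its script): soft assignments are maximum-entropy Gibbs distributions at temperature T, centroids are updated as probability-weighted means (a free-energy minimization step), and T is lowered geometrically so assignments harden toward k-means. All names and parameter values are arbitrary.

```python
import numpy as np

def deterministic_annealing(X, k, T_init=10.0, T_min=0.01, cooling=0.9, inner_iter=20):
    """Illustrative deterministic-annealing clustering (assumed sketch):
    alternate Gibbs assignments and weighted-mean centroid updates while
    lowering the temperature T."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    # Canonical DA start: all centroids near the data mean; clusters split
    # as T drops through critical temperatures.
    centroids = X.mean(0) + 1e-2 * rng.standard_normal((k, d))
    T = T_init
    while T > T_min:
        for _ in range(inner_iter):
            sq = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)  # (n, k)
            logits = -sq / T
            logits -= logits.max(1, keepdims=True)       # numerical stability
            p = np.exp(logits)
            p /= p.sum(1, keepdims=True)                 # Gibbs assignment probabilities
            centroids = (p.T @ X) / np.clip(p.sum(0)[:, None], 1e-12, None)
        T *= cooling                                     # geometric cooling schedule
    return centroids, p

# Toy usage: 300 points in the plane, 3 clusters.
X = np.random.default_rng(1).standard_normal((300, 2))
centroids, p = deterministic_annealing(X, k=3)
```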
Lecture notes: A draft of a script will be provided. Lecture slides will be made available.
Literature:
- Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. Springer, 2001.
- Devroye, Györfi, Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Knowledge of machine learning ("Introduction to Machine Learning" and/or "Advanced Machine Learning") and basic knowledge of statistics.
| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 261-5120-00L | Machine Learning for Health Care | W | 5 credits | 3P + 1A | G. Rätsch, J. Vogt, V. Boeva |

Note: Number of participants limited to 150.
Abstract: The course will review the most relevant methods and applications of machine learning in biomedicine, and discuss the main challenges they present and their current technical solutions.
Learning objective: In recent years we have observed rapid growth in the field of machine learning (ML), mainly due to improvements in ML algorithms, increased data availability and reduced computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types allows us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine and discuss the main challenges they present and their current technical solutions.
Content: The course consists of four topic clusters covering the most relevant applications of ML in biomedicine:
1) Structured time series: Temporal series of structured data often appear in biomedical datasets and present challenges such as variables with different periodicities, conditioning on static data, etc.
2) Medical notes: Vast amounts of medical observations are stored in the form of free text; we will analyze strategies for extracting knowledge from them.
3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them.
4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets in biomedicine, many relevant ML applications can be expected to arise in the near future. We will review and discuss current applications and challenges.
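As a hedged, minimal sketch of the preprocessing challenge described in topic 1 (an assumed example, not course material): mapping an irregularly sampled clinical variable onto a regular time grid with last-observation-carried-forward, plus a missingness mask that a downstream model can consume. Function and variable names are hypothetical.

```python
import numpy as np

def to_grid_locf(timestamps, values, grid):
    """Resample an irregularly sampled variable onto a regular grid using
    last-observation-carried-forward; also return a 0/1 mask that is 1
    wherever at least one real measurement precedes the grid point."""
    timestamps = np.asarray(timestamps, dtype=float)
    values = np.asarray(values, dtype=float)
    order = np.argsort(timestamps)
    timestamps, values = timestamps[order], values[order]
    # Index of the last observation at or before each grid point.
    idx = np.searchsorted(timestamps, grid, side="right") - 1
    observed = idx >= 0
    filled = np.where(observed, values[np.clip(idx, 0, None)], np.nan)
    return filled, observed.astype(float)

# Toy heart-rate series sampled at irregular times (hours).
hr_t = [0.2, 1.5, 4.0]
hr_v = [72.0, 80.0, 76.0]
grid = np.arange(0, 6)                # hourly grid
x, mask = to_grid_locf(hr_t, hr_v, grid)
# x    -> [nan, 72., 80., 80., 76., 76.]
# mask -> [0., 1., 1., 1., 1., 1.]
```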
Prerequisites / Notice: Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line.

Relation to course 261-5100-00 Computational Biomedicine: this course is a continuation of the previous course, with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
| Number | Title | Type | ECTS | Hours | Lecturers |
|---|---|---|---|---|---|
| 263-5300-00L | Guarantees for Machine Learning | W | 5 credits | 2V + 2A | F. Yang |
Abstract: This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.
Learning objective: This course is aimed at advanced Master's and doctoral students who want to understand and/or conduct independent research on theory for modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions and learning how to effectively present research ideas to a professional audience.
Content: This course teaches classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in:
- concentration bounds, uniform convergence
- high-dimensional statistics (e.g. the Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to:
- how overparameterization can help generalization (interpolating models, linearized NNs)
- how overparameterization can help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NNs
- generalization in robust learning (adversarial robustness, standard vs. robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)
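As a hedged illustration of the high-dimensional statistics topic above (an assumed sketch, not course material): classical Lasso theory suggests choosing the regularization level on the order of the noise level times sqrt(log p / n), under which a sufficiently strong sparse signal can be recovered. A minimal synthetic example in Python:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sparse linear model: n = 100 samples, p = 500 features, 5 active coefficients.
rng = np.random.default_rng(0)
n, p, s, sigma = 100, 500, 5, 0.5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0
y = X @ beta + sigma * rng.standard_normal(n)

# Regularization level of the theory-suggested order sigma * sqrt(log(p) / n).
lam = sigma * np.sqrt(np.log(p) / n)
fit = Lasso(alpha=lam, max_iter=10_000).fit(X, y)
support = np.flatnonzero(fit.coef_)   # ideally contains exactly {0, ..., 4}
```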
Prerequisites / Notice: It is absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning, as taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken a course in optimization or approximation theory. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.