Search result: Catalogue data in Spring Semester 2021
Mathematics Bachelor | ||||||
Electives | ||||||
Selection: Financial and Insurance Mathematics | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|---|
401-3923-00L | Selected Topics in Life Insurance Mathematics | W | 4 credits | 2V | M. Koller | |
Abstract | Stochastic models for life insurance: 1) Markov chains 2) Stochastic processes for demography and interest rates 3) Cash flow streams and reserves 4) Mathematical reserves and Thiele's differential equation 5) Theorem of Hattendorff 6) Unit-linked policies | |||||
Objective | ||||||
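The abstract above mentions Thiele's differential equation (item 4). For a simple single-life policy with force of interest $\delta(t)$, mortality intensity $\mu(t)$, premium rate $\pi(t)$ and death benefit $b(t)$, it describes the evolution of the prospective reserve $V(t)$:

```latex
\frac{d}{dt} V(t) = \pi(t) + \delta(t)\, V(t) - \mu(t)\,\bigl(b(t) - V(t)\bigr)
```

Here $b(t) - V(t)$ is the sum at risk; the exact form depends on the policy's benefit structure, so this is only the textbook special case.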
401-3917-00L | Stochastic Loss Reserving Methods | W | 4 credits | 2V | R. Dahms | |
Abstract | Loss reserving is one of the central topics in non-life insurance. Mathematicians and actuaries need to estimate adequate reserves for liabilities caused by claims. These claims reserves influence all financial statements, future premiums and solvency margins. We present the stochastics behind various methods that are used in practice to calculate those loss reserves. | |||||
Objective | Our goal is to present the stochastics behind various methods that are used in practice to estimate claims reserves. These methods enable us to set adequate reserves for liabilities caused by claims and to determine the prediction errors of these estimates. | |||||
Content | We will present the following stochastic claims reserving methods/models: - Stochastic Chain-Ladder Method - Bayesian Methods, Bornhuetter-Ferguson Method, Credibility Methods - Distributional Models - Linear Stochastic Reserving Models, with and without inflation - Bootstrap Methods - Claims Development Result (solvency view) - Coupling of portfolios | |||||
Literature | M. V. Wüthrich, M. Merz, Stochastic Claims Reserving Methods in Insurance, Wiley 2008. | |||||
Prerequisites / Notice | The exams ONLY take place during the official ETH examination periods. This course will be held in English and counts towards the diploma "Aktuar SAV". For the latter, see details under Link. Basic knowledge in probability theory is assumed, in particular conditional expectations. | |||||
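As a rough illustration of the deterministic core behind the stochastic chain-ladder method listed in the content above, the sketch below completes a small cumulative claims triangle. The numbers are made up and the code is purely pedagogical; it does not implement the course's stochastic models or prediction-error estimates.

```python
# Toy chain-ladder sketch (illustrative only; triangle values are made up).
# Cumulative claims triangle: rows = accident years, columns = development years.
triangle = [
    [100.0, 150.0, 165.0],
    [110.0, 170.0],
    [120.0],
]

def chain_ladder(tri):
    n = len(tri)
    # Age-to-age development factors f_j = sum_i C_{i,j+1} / sum_i C_{i,j},
    # taken over accident years i that have observed development year j+1.
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in tri if len(row) > j + 1)
        den = sum(row[j] for row in tri if len(row) > j + 1)
        factors.append(num / den)
    # Complete the triangle: project each accident year to its ultimate claim.
    ultimates = []
    for row in tri:
        c = row[-1]
        for j in range(len(row) - 1, n - 1):
            c *= factors[j]
        ultimates.append(c)
    # Reserve = projected ultimate minus latest observed cumulative claims.
    reserves = [u - row[-1] for u, row in zip(ultimates, tri)]
    return factors, ultimates, reserves

factors, ultimates, reserves = chain_ladder(triangle)
```

The stochastic models treated in the course (e.g. Mack's model) put a distributional structure around exactly these development factors, which is what makes prediction errors quantifiable.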
401-3956-00L | Economic Theory of Financial Markets | W | 4 credits | 2V | M. V. Wüthrich | |
Abstract | This lecture provides an introduction to the economic theory of financial markets. It presents the basic financial and economic concepts to insurance mathematicians and actuaries. | |||||
Objective | This lecture aims at providing the fundamental financial and economic concepts to insurance mathematicians and actuaries. It focuses on portfolio theory, cash flow valuation and deflator techniques. | |||||
Content | We treat the following topics: - Fundamental concepts in economics - Portfolio theory - Mean variance analysis, capital asset pricing model - Arbitrage pricing theory - Cash flow theory - Valuation principles - Stochastic discounting, deflator techniques - Interest rate modeling - Utility theory | |||||
Prerequisites / Notice | The exams ONLY take place during the official ETH examination period. This course will be held in English and counts towards the diploma of "Aktuar SAV". For the latter, see details under Link. Knowledge in probability theory, stochastic processes and statistics is assumed. | |||||
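One formula at the heart of the mean-variance and CAPM topics listed above: in the capital asset pricing model, the expected excess return of asset $i$ is proportional to its exposure to the market portfolio,

```latex
\mathbb{E}[R_i] - r_f = \beta_i \,\bigl(\mathbb{E}[R_m] - r_f\bigr),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)},
```

where $r_f$ is the risk-free rate and $R_m$ the return of the market portfolio.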
401-3936-00L | Data Analytics for Non-Life Insurance Pricing | W | 4 credits | 2V | C. M. Buser, M. V. Wüthrich | |
Abstract | We study statistical methods in supervised learning for non-life insurance pricing such as generalized linear models, generalized additive models, Bayesian models, neural networks, classification and regression trees, random forests and gradient boosting machines. | |||||
Objective | The student is familiar with classical actuarial pricing methods as well as with modern machine learning methods for insurance pricing and prediction. | |||||
Content | We present the following chapters: - generalized linear models (GLMs) - generalized additive models (GAMs) - neural networks - credibility theory - classification and regression trees (CARTs) - bagging, random forests and boosting | |||||
Lecture notes | The lecture notes are available from: Link | |||||
Prerequisites / Notice | This course will be held in English and counts towards the diploma of "Aktuar SAV". For the latter, see details under Link. Good knowledge in probability theory, stochastic processes and statistics is assumed. | |||||
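To make the first chapter concrete, here is a minimal Poisson GLM with log link, fitted by gradient ascent on the log-likelihood. This is a toy sketch on made-up claim counts with a single rating factor, not the course's methodology; in practice one would use a statistical package and many rating factors with exposures.

```python
import math

# Minimal Poisson GLM (log link) fitted by gradient ascent -- a toy sketch,
# not a production pricing model. Data below are made-up claim counts.
x = [0.0, 1.0, 2.0, 3.0, 4.0]   # a single rating factor
y = [1, 2, 4, 7, 12]            # observed claim counts

b0, b1 = 0.0, 0.0               # intercept and slope on the log scale
lr = 0.001
for _ in range(20000):
    g0 = g1 = 0.0
    for xi, yi in zip(x, y):
        lam = math.exp(b0 + b1 * xi)   # expected claim frequency
        g0 += yi - lam                 # score equations of the Poisson GLM
        g1 += (yi - lam) * xi
    b0 += lr * g0
    b1 += lr * g1

pred = [math.exp(b0 + b1 * xi) for xi in x]
```

At the maximum-likelihood estimate the score equations vanish, i.e. observed and fitted claim counts balance both in total and weighted by the rating factor.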
401-4920-00L | Market-Consistent Actuarial Valuation Does not take place this semester. | W | 4 credits | 2V | M. V. Wüthrich | |
Abstract | Introduction to market-consistent actuarial valuation. Topics: Stochastic discounting, full balance sheet approach, valuation portfolio in life and non-life insurance, technical and financial risks, risk management for insurance companies. | |||||
Objective | Goal is to give the basic mathematical tools for describing insurance products within a financial market and economic environment and provide the basics of solvency considerations. | |||||
Content | In this lecture we give a full balance sheet approach to the task of actuarial valuation of an insurance company. To this end, we introduce a multidimensional valuation portfolio (VaPo) on the liability side of the balance sheet. The basis of this multidimensional VaPo is a set of financial instruments. This approach makes the liability side of the balance sheet directly comparable to its asset side. The lecture is based on four sections: 1) Stochastic discounting 2) Construction of a multidimensional Valuation Portfolio for life insurance products (with guarantees) 3) Construction of a multidimensional Valuation Portfolio for a run-off portfolio of a non-life insurance company 4) Measuring financial risks in a full balance sheet approach (ALM risks) | |||||
Literature | Market-Consistent Actuarial Valuation, 3rd edition. Wüthrich, M.V. EAA Series, Springer 2016. ISBN: 978-3-319-46635-4 Wüthrich, M.V., Merz, M. Claims run-off uncertainty: the full picture. SSRN Manuscript ID 2524352 (2015). England, P.D., Verrall, R.J., Wüthrich, M.V. On the lifetime and one-year views of reserve risk, with application to IFRS 17 and Solvency II risk margins. Insurance: Mathematics and Economics 85 (2019), 74-88. Wüthrich, M.V., Embrechts, P., Tsanakas, A. Risk margin for a non-life insurance run-off. Statistics & Risk Modeling 28 (2011), no. 4, 299-317. Financial Modeling, Actuarial Valuation and Solvency in Insurance. Wüthrich, M.V., Merz, M. Springer Finance 2013. ISBN: 978-3-642-31391-2 Cheridito, P., Ery, J., Wüthrich, M.V. Assessing asset-liability risk with neural networks. Risks 8/1 (2020), article 16. | |||||
Prerequisites / Notice | The exams ONLY take place during the official ETH examination period. This course will be held in English and counts towards the diploma of "Aktuar SAV". For the latter, see details under Link. Knowledge in probability theory, stochastic processes and statistics is assumed. | |||||
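The stochastic discounting section above is built around state-price deflators. Sketching the standard formula (notation may differ slightly in the lecture), the value at time 0 of an insurance cash flow $X = (X_0, \dots, X_n)$ is

```latex
Q_0[X] \;=\; \frac{1}{\varphi_0}\,
\mathbb{E}\!\left[\,\sum_{t=0}^{n} \varphi_t X_t \,\middle|\, \mathcal{F}_0 \right],
```

where $(\varphi_t)_t$ is the state-price deflator and $(\mathcal{F}_t)_t$ the underlying filtration; market consistency means that the same deflator correctly prices the traded financial instruments spanning the VaPo.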
Selection: Mathematical Physics, Theoretical Physics In the Bachelor's programme in Mathematics 402-0204-00L Electrodynamics is eligible as an elective course, but only if 402-0224-00L Theoretical Physics isn't recognised for credits (neither in the Bachelor's nor in the Master's programme). For the category assignment, contact the Study Administration Office (Link) after having received the credits. | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
401-2334-00L | Methods of Mathematical Physics II | W | 6 credits | 3V + 2U | T. H. Willwacher | |
Abstract | Group theory: groups, representation of groups, unitary and orthogonal groups, Lorentz group. Lie theory: Lie algebras and Lie groups. Representation theory: representation theory of finite groups, representations of Lie algebras and Lie groups, physical applications (eigenvalue problems with symmetry). | |||||
Objective | ||||||
402-0206-00L | Quantum Mechanics II Special Students UZH must book the module PHY351 directly at UZH. | W | 10 credits | 3V + 2U | P. Jetzer | |
Abstract | Many-body quantum physics rests on symmetry considerations that lead to two kinds of particles, fermions and bosons. Formal techniques include Hartree-Fock theory and second-quantization techniques, as well as quantum statistics with ensembles. Few- and many-body systems include atoms, molecules, the Fermi sea, elastic chains, radiation and its interaction with matter, and ideal quantum gases. | |||||
Objective | Basic command of few- and many-particle physics for fermions and bosons, including second quantisation and quantum statistical techniques. Understanding of elementary many-body systems such as atoms, molecules, the Fermi sea, electromagnetic radiation and its interaction with matter, ideal quantum gases and relativistic theories. | |||||
Content | The description of indistinguishable particles leads us to (exchange-) symmetrized wave functions for fermions and bosons. We discuss simple few-body problems (helium atom, hydrogen molecule) and proceed with a systematic description of fermionic many-body problems (Hartree-Fock approximation, screening, correlations with applications to atoms and the Fermi sea). The second quantisation formalism allows for the compact description of the Fermi gas, of elastic strings (phonons), and the radiation field (photons). We study the interaction of radiation and matter and the associated phenomena of radiative decay, light scattering, and the Lamb shift. A quantum statistical description of ideal Bose and Fermi gases at finite temperatures concludes the program. If time permits, we will touch upon relativistic one-particle physics, the Klein-Gordon equation for spin-0 bosons and the Dirac equation describing spin-1/2 fermions. | |||||
Literature | G. Baym, Lectures on Quantum Mechanics (Benjamin, Menlo Park, California, 1969) L.I. Schiff, Quantum Mechanics (McGraw-Hill, New York, 1955) A. Messiah, Quantum Mechanics I & II (North-Holland, Amsterdam, 1976) E. Merzbacher, Quantum Mechanics (John Wiley, New York, 1998) C. Cohen-Tannoudji, B. Diu, F. Laloe, Quantum Mechanics I & II (John Wiley, New York, 1977) R.P. Feynman and A.R. Hibbs, Quantum Mechanics and Path Integrals (McGraw-Hill, New York, 1965) A.L. Fetter and J.D. Walecka, Theoretical Mechanics of Particles and Continua (McGraw-Hill, New York, 1980) J.J. Sakurai, Modern Quantum Mechanics (Addison Wesley, Reading, 1994) J.J. Sakurai, Advanced Quantum Mechanics (Addison Wesley) F. Gross, Relativistic Quantum Mechanics and Field Theory (John Wiley, New York, 1993) | |||||
Prerequisites / Notice | Basic knowledge of single-particle Quantum Mechanics | |||||
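The second-quantisation formalism mentioned in the content rests on creation and annihilation operators $a_k^\dagger, a_k$ obeying canonical (anti)commutation relations:

```latex
[a_k, a_{k'}^{\dagger}]_{\mp} = \delta_{kk'},
\qquad
[a_k, a_{k'}]_{\mp} = 0,
```

with the commutator ($-$) for bosons and the anticommutator ($+$) for fermions; a non-interacting many-body Hamiltonian then takes the compact form $H = \sum_k \varepsilon_k\, a_k^\dagger a_k$.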
Selection: Mathematical Optimization, Discrete Mathematics | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
401-3902-21L | Network & Integer Optimization: From Theory to Application | W | 6 credits | 3G | R. Zenklusen | |
Abstract | This course covers various topics in Network and (Mixed-)Integer Optimization. It starts with a rigorous study of algorithmic techniques for some network optimization problems (with a focus on matching problems) and moves to key aspects of how to attack various optimization settings through well-designed (Mixed-)Integer Programming formulations. | |||||
Objective | Our goal is for students to both get a good foundational understanding of some key network algorithms and also to learn how to effectively employ (Mixed-)Integer Programming formulations, techniques, and solvers, to tackle a wide range of discrete optimization problems. | |||||
Content | Key topics include: - Matching problems; - Integer Programming techniques and models; - Extended formulations and strong problem formulations; - Solver techniques for (Mixed-)Integer Programs; - Decomposition approaches. | |||||
Literature | - Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. - Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003. This work has 3 volumes. - François Vanderbeck, Laurence Wolsey: Reformulations and Decomposition of Integer Programs. Chapter 13 in: 50 Years of Integer Programming 1958-2008. Springer, 2010. - Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | |||||
Prerequisites / Notice | Solid background in linear algebra. Preliminary knowledge of Linear Programming is ideal but not a strict requirement. Prior attendance of the course Mathematical Optimization is a plus. | |||||
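Two of the topics above meet in the standard integer-programming formulation of maximum-weight matching: given a graph $G=(V,E)$ with edge weights $w_e$,

```latex
\max \sum_{e \in E} w_e x_e
\quad \text{s.t.} \quad
\sum_{e \in \delta(v)} x_e \le 1 \;\; \forall v \in V,
\qquad
x_e \in \{0,1\} \;\; \forall e \in E,
```

where $\delta(v)$ denotes the edges incident to $v$. For non-bipartite graphs the LP relaxation needs additional (blossom) inequalities to describe the matching polytope, which is exactly the kind of strong problem formulation the course discusses.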
401-3908-21L | Polynomial Optimization | W | 6 credits | 3G | A. A. Kurpisz | |
Abstract | Introduction to Polynomial Optimization and methods to solve its convex relaxations. | |||||
Objective | The goal of this course is to provide a treatment of non-convex Polynomial Optimization problems through the lens of various techniques to solve its convex relaxations. Part of the course will be focused on learning how to apply these techniques to practical examples in finance, robotics and control. | |||||
Content | Key topics include: - Polynomial Optimization as a non-convex optimization problem and its connection to certifying non-negativity of polynomials - Optimization-free and Linear Programming based techniques to approach Polynomial Optimization problems. - Introduction of Second-Order Cone Programming, Semidefinite Programming and Relative Entropy Programming as a tool to solve relaxations of Polynomial Optimization problems. - Applications to optimization problems in finance, robotics and control. | |||||
Lecture notes | A script will be provided. | |||||
Literature | Other helpful materials include: - Jean Bernard Lasserre, An Introduction to Polynomial and Semi-Algebraic Optimization, Cambridge University Press, February 2015 - Pablo Parrilo, 6.972 Algebraic Techniques and Semidefinite Optimization, Spring 2006, Massachusetts Institute of Technology: MIT OpenCourseWare. | |||||
Prerequisites / Notice | Background in Linear and Integer Programming is recommended. | |||||
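The connection between polynomial optimization and certifying non-negativity mentioned above can be written in one line: minimizing a polynomial $p$ over $\mathbb{R}^n$ amounts to finding the largest lower bound, and the sum-of-squares (SOS) relaxation replaces non-negativity by an SOS certificate checkable via semidefinite programming:

```latex
p^{\ast} = \sup \{ \lambda : p(x) - \lambda \ge 0 \;\; \forall x \in \mathbb{R}^n \}
\;\;\ge\;\;
p^{\ast}_{\mathrm{sos}} = \sup \Bigl\{ \lambda : p - \lambda = \textstyle\sum_i \sigma_i^2,\; \sigma_i \text{ polynomials} \Bigr\}.
```

The linear-programming and relative-entropy-programming techniques in the course provide alternative, cheaper certificates of the same flavour.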
Selection: Theoretical Computer Science In the Bachelor's programme in Mathematics 401-3052-05L Graph Theory is eligible as an elective course, but only if 401-3052-10L Graph Theory isn't recognised for credits (neither in the Bachelor's nor in the Master's programme). For the category assignment, contact the Study Administration Office (Link) after having received the credits. | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
252-0408-00L | Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt, U. Maurer | |
Abstract | The course presents a selection of hot research topics in cryptography. The choice of topics varies and may include provable security, interactive proofs, zero-knowledge protocols, secret sharing, secure multi-party computation, e-voting, etc. | |||||
Objective | Introduction to a very active research area with many gems and paradoxical results. Spark interest in fundamental problems. | |||||
Content | The course presents a selection of hot research topics in cryptography. The choice of topics varies and may include provable security, interactive proofs, zero-knowledge protocols, secret sharing, secure multi-party computation, e-voting, etc. | |||||
Lecture notes | The lecture notes are in German, but they are not required, as the entire course material is also documented in other course material (in English). | |||||
Prerequisites / Notice | A basic understanding of fundamental cryptographic concepts (as taught for example in the course Information Security or in the course Cryptography Foundations) is useful, but not required. | |||||
263-4660-00L | Applied Cryptography Number of participants limited to 150. | W | 8 credits | 3V + 2U + 2P | K. Paterson | |
Abstract | This course will introduce the basic primitives of cryptography, using rigorous syntax and game-based security definitions. The course will show how these primitives can be combined to build cryptographic protocols and systems. | |||||
Objective | The goal of the course is to put students' understanding of cryptography on sound foundations, to enable them to start to build well-designed cryptographic systems, and to expose them to some of the pitfalls that arise when doing so. | |||||
Content | Basic symmetric primitives (block ciphers, modes, hash functions); generic composition; AEAD; basic secure channels; basic public key primitives (encryption, signature, DH key exchange); ECC; randomness; applications. | |||||
Literature | Textbook: Boneh and Shoup, “A Graduate Course in Applied Cryptography”, Link. | |||||
Prerequisites / Notice | Students should have taken the D-INFK Bachelor's course “Information Security" (252-0211-00) or an alternative first course covering cryptography at a similar level. / In this course, we will use Moodle for content delivery: Link. | |||||
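The "generic composition" topic above refers to combining a cipher and a MAC into an authenticated scheme. Below is a toy Encrypt-then-MAC sketch using only the Python standard library; the "cipher" is a one-time pad over fresh random bytes purely for illustration, and real systems would use a proper AEAD scheme such as AES-GCM or ChaCha20-Poly1305.

```python
import hashlib
import hmac
import secrets

# Toy Encrypt-then-MAC (EtM) generic composition. Pedagogical only:
# the "cipher" is a one-time pad, NOT a real encryption scheme.

def encrypt_then_mac(pad, mac_key, msg):
    ct = bytes(m ^ k for m, k in zip(msg, pad))              # encrypt first
    tag = hmac.new(mac_key, ct, hashlib.sha256).digest()     # then MAC the ciphertext
    return ct, tag

def decrypt_and_verify(pad, mac_key, ct, tag):
    # Verify the tag BEFORE decrypting; reject on any mismatch.
    expected = hmac.new(mac_key, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("MAC verification failed")
    return bytes(c ^ k for c, k in zip(ct, pad))

msg = b"attack at dawn"
pad = secrets.token_bytes(len(msg))
mac_key = secrets.token_bytes(32)
ct, tag = encrypt_then_mac(pad, mac_key, msg)
pt = decrypt_and_verify(pad, mac_key, ct, tag)
```

MACing the ciphertext (rather than the plaintext) is what makes EtM the composition with the cleanest generic security proof, one of the pitfalls the course explores.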
Selection: Further Realms | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
401-2684-00L | Mathematics of Machine Learning | W | 5 credits | 2V + 1U | A. Bandeira, N. Zhivotovskii | |
Abstract | Introductory course to Mathematical aspects of Machine Learning, including Supervised Learning, Unsupervised Learning, Sparsity, and Online Learning. | |||||
Objective | Introduction to Mathematical aspects of Machine Learning. | |||||
Content | Mathematical aspects of Supervised Learning, Unsupervised Learning, Sparsity, and Online Learning. This is a mathematical course, with theorems and proofs. | |||||
Prerequisites / Notice | Note for non-Mathematics students: this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. | |||||
401-4944-20L | Mathematics of Data Science Does not take place this semester. | W | 8 credits | 4G | A. Bandeira | |
Abstract | Mostly self-contained, but fast-paced, introductory masters level course on various theoretical aspects of algorithms that aim to extract information from data. | |||||
Objective | Introduction to various mathematical aspects of Data Science. | |||||
Content | These topics lie in overlaps of (Applied) Mathematics with: Computer Science, Electrical Engineering, Statistics, and/or Operations Research. Each lecture will feature a couple of Mathematical Open Problem(s) related to Data Science. The main mathematical tools used will be Probability and Linear Algebra, and a basic familiarity with these subjects is required. There will also be some Graph Theory, Representation Theory and Applied Harmonic Analysis, among others, although knowledge of these tools is not assumed. The topics treated will include Dimension reduction, Manifold learning, Sparse recovery, Random Matrices, Approximation Algorithms, Community detection in graphs, and several others. | |||||
Lecture notes | Link | |||||
Prerequisites / Notice | The main mathematical tools used will be Probability, Linear Algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary. A. Bandeira and H. Bölcskei | |||||
252-0220-00L | Introduction to Machine Learning Limited number of participants. Preference is given to students in programmes in which the course is being offered. All other students will be waitlisted. Please do not contact Prof. Krause for any questions in this regard. If necessary, please contact Link | W | 8 credits | 4V + 2U + 1A | A. Krause, F. Yang | |
Abstract | The course introduces the foundations of learning and making predictions based on data. | |||||
Objective | The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. | |||||
Content | - Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent) - Linear classification: Logistic regression (feature selection, sparsity, multi-class) - Kernels and the kernel trick (Properties of kernels; applications to linear and logistic regression); k-nearest neighbor - Neural networks (backpropagation, regularization, convolutional neural networks) - Unsupervised learning (k-means, PCA, neural network autoencoders) - The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference) - Statistical decision theory (decision making based on statistical models and utility functions) - Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions) - Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE) - Bayesian approaches to unsupervised learning (Gaussian mixtures, EM) | |||||
Literature | Textbook: Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press | |||||
Prerequisites / Notice | Designed to provide a basis for following courses: - Advanced Machine Learning - Deep Learning - Probabilistic Artificial Intelligence - Seminar "Advanced Topics in Machine Learning" | |||||
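The very first topic in the content above, linear regression via gradient descent, can be sketched in a few lines. This is a toy batch-gradient-descent fit on fabricated, noise-free data, not course material.

```python
# Toy sketch: linear regression y = w*x + b fitted by batch gradient descent.
# Fabricated data lying exactly on the line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0
lr = 0.05
n = len(xs)
for _ in range(5000):
    # Gradients of the mean squared error (1/n) * sum (w*x + b - y)^2
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * gw
    b -= lr * gb
```

Stochastic gradient descent, as covered in the course, replaces the full-batch sums with estimates from single examples or mini-batches.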
263-5300-00L | Guarantees for Machine Learning Number of participants limited to 30. Last cancellation/deregistration date for this graded semester performance: 17 March 2021! Please note that after that date no deregistration will be accepted and a "no show" will appear on your transcript. | W | 7 credits | 3G + 3A | F. Yang | |
Abstract | This course is aimed at advanced master and doctorate students who want to conduct independent research on theory for modern machine learning (ML). It teaches classical and recent methods in statistical learning theory commonly used to prove theoretical guarantees for ML algorithms. The knowledge is then applied in independent project work that focuses on understanding modern ML phenomena. | |||||
Objective | Learning objectives: - acquire enough mathematical background to understand a good fraction of theory papers published in the typical ML venues. For this purpose, students will learn common mathematical techniques from statistics and optimization in the first part of the course and apply this knowledge in the project work - critically examine recently published work in terms of relevance and determine impactful (novel) research problems. This will be an integral part of the project work and involves experimental as well as theoretical questions - find and outline an approach (some subproblem) to prove a conjectured theorem. This will be practiced in lectures / exercise and homeworks and potentially in the final project. - effectively communicate and present the problem motivation, new insights and results to a technical audience. This will be primarily learned via the final presentation and report as well as during peer-grading of peer talks. | |||||
Content | This course covers foundational methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, touching on the following topics: - concentration bounds - uniform convergence and empirical process theory - high-dimensional statistics (e.g. sparsity) - regularization for non-parametric statistics (e.g. in RKHS, neural networks) - implicit regularization via gradient descent (e.g. margins, early stopping) - minimax lower bounds The project work focuses on current theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to - how overparameterization could help generalization (RKHS, NN) - how overparameterization could help optimization (non-convex optimization, loss landscape) - complexity measures and approximation-theoretic properties of randomly initialized and trained NN - generalization of robust learning (adversarial robustness, standard and robust error tradeoff, distribution shift) | |||||
Prerequisites / Notice | It’s absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning", "Regression"/"Statistical Modelling". In addition to these prerequisites, this class requires a high degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. Students have usually taken a subset of Fundamentals of Mathematical Statistics, Probabilistic AI, Neural Network Theory, Optimization for Data Science, Advanced ML, Statistical Learning Theory, Probability Theory (D-MATH) | |||||
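A prototypical result from the "concentration bounds" topic listed above is Hoeffding's inequality: for i.i.d. random variables $X_1, \dots, X_n$ taking values in $[a, b]$,

```latex
\mathbb{P}\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} \bigl( X_i - \mathbb{E}[X_i] \bigr) \right| \ge t \right)
\;\le\;
2 \exp\!\left( - \frac{2 n t^{2}}{(b-a)^{2}} \right).
```

Bounds of this type are the basic building block behind the uniform convergence and generalization guarantees the course then develops.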
227-0432-00L | Learning, Classification and Compression | W | 4 credits | 2V + 1U | E. Riegler | |
Abstract | The course takes a theoretical approach to learning theory and classification, and gives an introduction to lossy and lossless compression for general sets and measures. We mainly focus on a probabilistic setting, where an underlying distribution must be learned/compressed. The concepts acquired in the course are of broad and general interest in data sciences. | |||||
Objective | After attending this lecture and participating in the exercise sessions, students will have acquired a working knowledge of learning theory, classification, and compression. | |||||
Content | 1. Learning Theory (a) Framework of Learning (b) Hypothesis Spaces and Target Functions (c) Reproducing Kernel Hilbert Spaces (d) Bias-Variance Tradeoff (e) Estimation of Sample and Approximation Error 2. Classification (a) Binary Classifier (b) Support Vector Machines (separable case) (c) Support Vector Machines (nonseparable case) (d) Kernel Trick 3. Lossy and Lossless Compression (a) Basics of Compression (b) Compressed Sensing for General Sets and Measures (c) Quantization and Rate Distortion Theory for General Sets and Measures | |||||
Lecture notes | Detailed lecture notes will be provided. | |||||
Prerequisites / Notice | This course is aimed at students with a solid background in measure theory and linear algebra and basic knowledge in functional analysis. | |||||
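The kernel trick appearing in both the RKHS and SVM parts above can be stated compactly: a kernel $k$ implicitly defines a feature map $\varphi$ into an RKHS $\mathcal{H}$, and algorithms that only use inner products never need $\varphi$ explicitly. For the SVM, the dual decision function is

```latex
k(x, x') = \langle \varphi(x), \varphi(x') \rangle_{\mathcal{H}},
\qquad
f(x) = \operatorname{sign}\!\left( \sum_{i=1}^{n} \alpha_i y_i\, k(x_i, x) + b \right),
```

so both training and prediction depend on the data only through kernel evaluations.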
401-3502-21L | Reading Course To start an individual reading course, contact an authorised supervisor Link and register your reading course in myStudies. | W | 2 credits | 4A | Supervisors | |
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective | ||||||
401-3503-21L | Reading Course To start an individual reading course, contact an authorised supervisor Link and register your reading course in myStudies. | W | 3 credits | 6A | Supervisors | |
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective | ||||||
401-3504-21L | Reading Course To start an individual reading course, contact an authorised supervisor Link and register your reading course in myStudies. | W | 4 credits | 9A | Supervisors | |
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective | ||||||
Core Courses and Electives (Mathematics Master) | ||||||
» Core Courses (Mathematics Master) |