Search result: Catalogue data in Autumn Semester 2023
Data Science Master
Master's Programme (Programme Regulations 2023)
Core Courses
Data Analysis
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
252-0535-00L | Advanced Machine Learning | W | 10 ECTS | 3V + 2U + 4A | J. M. Buhmann
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective | Students will become familiar with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data.
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: Fundamentals (What is data? Bayesian learning; computational learning theory); Supervised learning (ensembles: bagging and boosting, illustrated in the sketch after this entry; max-margin methods; neural networks); Unsupervised learning (dimensionality reduction techniques; clustering; mixture models; non-parametric density estimation; learning dynamical systems).
Lecture notes | No lecture notes, but slides will be made available on the course webpage.
Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving the assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on project and exam) to gain credit points.
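The ensemble methods named under Content (bagging and boosting) can be conveyed in a few lines. Below is a minimal sketch using scikit-learn on synthetic data; the dataset and hyperparameters are arbitrary illustrative choices, not course material.

```python
# Minimal bagging vs. boosting illustration (toy data, arbitrary hyperparameters).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("bagging", BaggingClassifier(n_estimators=50, random_state=0)),
                    ("boosting", AdaBoostClassifier(n_estimators=50, random_state=0))]:
    model.fit(X_tr, y_tr)   # bagging averages bootstrap models; boosting reweights errors
    print(name, model.score(X_te, y_te))
```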
263-5210-00L | Probabilistic Artificial Intelligence | W | 8 ECTS | 3V + 2U + 2A | A. Krause
Abstract | This course introduces core modeling techniques and algorithms from machine learning, optimization and control for reasoning and decision making under uncertainty, and studies applications in areas such as robotics.
Learning objective | How can we build systems that perform well in uncertain environments? How can we develop systems that exhibit "intelligent" behavior without prescribing explicit rules? How can we build systems that learn from experience in order to improve their performance? We will study core modeling techniques and algorithms from statistics, optimization, planning, and control, and explore applications in areas such as robotics. The course is designed for graduate students.
Content | Topics covered: probability; probabilistic inference (variational inference, MCMC); Bayesian learning (Gaussian processes, Bayesian deep learning); probabilistic planning (MDPs, POMDPs); multi-armed bandits and Bayesian optimization (the bandit setting is illustrated in the sketch after this entry); reinforcement learning.
Prerequisites / Notice | Solid basic knowledge in statistics, algorithms and programming. The material covered in the course "Introduction to Machine Learning" is considered a prerequisite.
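As one concrete instance of the bandit topic listed above, here is a minimal UCB1 sketch in plain NumPy. The Bernoulli arm means are invented for illustration; this is not the course's reference implementation.

```python
import numpy as np

# UCB1 on Bernoulli arms (arm means are hypothetical toy values).
rng = np.random.default_rng(0)
means = np.array([0.2, 0.5, 0.7])            # unknown to the learner
counts = np.ones(len(means))                 # play each arm once to initialize
rewards = rng.binomial(1, means).astype(float)

for t in range(len(means), 10000):
    ucb = rewards / counts + np.sqrt(2 * np.log(t) / counts)  # mean + exploration bonus
    a = int(np.argmax(ucb))
    rewards[a] += rng.binomial(1, means[a])
    counts[a] += 1

print("pulls per arm:", counts)              # most pulls should go to the best arm
```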
401-4944-20L | Mathematics of Data Science | W | 8 ECTS | 4G + 1A | A. Bandeira, A. Maillard
Abstract | A mostly self-contained but fast-paced introductory Master's-level course on various theoretical aspects of algorithms that aim to extract information from data.
Learning objective | Introduction to various mathematical aspects of Data Science.
Content | These topics lie in the overlap of (applied) mathematics with computer science, electrical engineering, statistics, and/or operations research. Each lecture will feature a couple of mathematical open problems related to Data Science. The main mathematical tools used will be probability and linear algebra, and a basic familiarity with these subjects is required. There will also be some graph theory, representation theory, and applied harmonic analysis, among others, although knowledge of these tools is not assumed. The topics treated will include dimension reduction (illustrated in the sketch after this entry), manifold learning, sparse recovery, random matrices, approximation algorithms, community detection in graphs, and several others.
Lecture notes | https://people.math.ethz.ch/~abandeira/BandeiraSingerStrohmer-MDS-draft.pdf
Prerequisites / Notice | The main mathematical tools used will be probability and linear algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary.
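To make the dimension-reduction topic concrete: PCA via the SVD is the standard linear tool behind many of the methods above. A minimal NumPy sketch follows; the random data is purely illustrative.

```python
import numpy as np

# PCA via SVD: project centered data onto the top-k right singular vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # hypothetical data: 200 samples, 50 features
Xc = X - X.mean(axis=0)               # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
Z = Xc @ Vt[:k].T                     # k-dimensional representation
print(Z.shape, "explained variance ratio:", (S[:k]**2).sum() / (S**2).sum())
```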
Data Management and Data Processing
Number | Title | Type | ECTS | Hours | Lecturers
263-3010-00L | Big Data | W | 10 ECTS | 3V + 2U + 4A | G. Fourny
Abstract | The key challenge of the information society is to turn data into information, information into knowledge, and knowledge into value. This has become increasingly complex. Data comes in larger volumes, diverse shapes, and from different sources. Data is more heterogeneous and less structured than forty years ago. Nevertheless, it still needs to be processed fast, with support for complex operations.
Learning objective | Do you want to be able to query your own data productively and efficiently in your future semester projects, Master's thesis, or PhD thesis? Are you looking for something beyond the Python+Pandas hype? This course teaches you how to do so, as well as the dos and don'ts. "Big Data" refers to the case when the amount of data is very large (100 GB and more), or when the data is not completely structured (or messy). The Big Data revolution has led to a completely new way to do business, e.g., to develop new products and business models, but also to do science -- which is sometimes referred to as data-driven science or the "fourth paradigm". Unfortunately, the quantity of data produced and available -- now in the Zettabyte range (that's 21 zeros) per year -- keeps growing faster than our ability to process it. Hence, new architectures and approaches for processing it are needed. Harnessing them must involve a deep understanding of data not only in the large, but also in the small. The field of databases evolves at a fast pace. In order to be prepared, to the extent possible, for the (r)evolutions that will take place in the next few decades, the emphasis of the lecture will be on the paradigms and core design ideas, while today's technologies will serve as supporting illustrations thereof. After attending this lecture, you should have gained an overview and understanding of the Big Data landscape, which is the basis on which one can make informed decisions, i.e., pick and orchestrate the relevant technologies together for addressing each one of your projects efficiently and consistently.
Content | This course gives an overview of database technologies and of the most important database design principles that lay the foundations of the Big Data universe. We take the monolithic, one-machine relational stack from the 1970s, smash it down and rebuild it on top of large clusters: starting with distributed storage, and all the way up to syntax, models, validation, processing, indexing, and querying. A broad range of aspects is covered with a focus on how they all fit together in the big picture of the Big Data ecosystem. No data is harmed during this course; however, please be psychologically prepared that our data may not always be in third normal form.
- Physical storage: distributed file systems (HDFS), object storage (S3), key-value stores
- Logical storage: document stores (MongoDB), column stores (HBase), graph databases (neo4j), data warehouses (ROLAP)
- Data formats and syntaxes (XML, JSON, RDF, Turtle, CSV, XBRL, YAML, protocol buffers, Avro)
- Data shapes and models (tables, trees, graphs, cubes)
- Type systems and schemas: atomic types, structured types (arrays, maps), set-based type systems (?, *, +)
- An overview of functional, declarative programming languages across data shapes (SQL, XQuery, JSONiq, Cypher, MDX)
- The most important query paradigms (selection, projection, joining, grouping, ordering, windowing)
- Paradigms for parallel processing, two-stage (MapReduce) and DAG-based (Spark); a toy MapReduce-style example follows this entry
- Resource management (YARN)
- What a data center is made of and why it matters (racks, nodes, ...)
- Underlying architectures (internal machinery of HDFS, HBase, Spark, neo4j)
- Optimization techniques (functional and declarative paradigms, query plans, rewrites, indexing)
- Applications
Large-scale analytics and machine learning are outside of the scope of this course.
Literature | Course textbook: https://ghislainfourny.github.io/big-data-textbook/ Papers from scientific conferences and journals; references will be given as part of the course material during the semester.
Prerequisites / Notice | The lecture is hybrid, meaning you can attend with us in the lecture hall, or on Zoom, or watch the recordings on YouTube later. Exercise sessions are in presence. This course, in the autumn semester, is only intended for:
- Computer Science students
- Data Science students
- CBB students with a Computer Science background
Mobility students in CS are also welcome and encouraged to attend. If you experience any issue while registering, please contact the study administration and you will be gladly added. For students of all other departments interested in this fascinating topic: I would love to have you visit my lectures as well! There is a series of two courses specially designed for you:
- "Information Systems for Engineers" (SQL, relational databases): this Fall
- "Big Data for Engineers" (similar to Big Data, but adapted for non-computer-scientists): Spring 2023
There is no hard dependency between them, so you can take them in either order, but it may be more enjoyable to start with Information Systems for Engineers. Students who successfully completed Big Data for Engineers are not allowed to enrol in the course Big Data.
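The two-stage MapReduce paradigm listed under Content can be conveyed in pure Python. This is a toy in-memory sketch of map/shuffle/reduce, not how HDFS-backed engines are actually implemented.

```python
from collections import defaultdict

# Toy word count in the two-stage MapReduce style (in-memory illustration only).
docs = ["big data is big", "data needs processing"]

# Map: emit (key, value) pairs.
pairs = [(w, 1) for doc in docs for w in doc.split()]

# Shuffle: group values by key.
groups = defaultdict(list)
for key, value in pairs:
    groups[key].append(value)

# Reduce: aggregate each group.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # {'big': 2, 'data': 2, ...}
```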
263-3845-00L | Data Management Systems (does not take place this semester) | W | 8 ECTS | 3V + 1U + 3A | G. Alonso
Abstract | The course covers the implementation aspects of data management systems, using relational database engines as a starting point to cover the basic concepts of efficient data processing, and then expanding those concepts to modern implementations in data centers and the cloud.
Learning objective | The goal of the course is to convey the fundamental aspects of efficient data management from a systems implementation perspective: storage, access, organization, indexing, consistency, concurrency, transactions, distribution, query compilation vs. interpretation, data representations, etc. Using conventional relational engines as a starting point, the course aims at providing in-depth coverage of the latest technologies used in data centers and the cloud to implement large-scale data processing in various forms.
Content | The course first covers fundamental concepts in data management: storage, locality, query optimization, declarative interfaces, concurrency control and recovery, buffer managers, management of the memory hierarchy, presenting them in a system-independent manner. The course places a special emphasis on understanding these basic principles, as they are key to understanding what problems existing systems try to address. It then proceeds to explore their implementation in modern relational engines supporting SQL, and expands to the range of systems used in the cloud: key-value stores, geo-replication, query-as-a-service, serverless, large-scale analytics engines, etc.
Literature | The main source of information for the course will be articles and research papers describing the architecture of the systems discussed. The list of papers will be provided at the beginning of the course.
Prerequisites / Notice | The course requires students to have completed the Data Modelling and Databases course at the Bachelor's level, as it assumes knowledge of databases and SQL.
263-4500-00L | Advanced Algorithms | W | 9 ECTS | 3V + 2U + 3A | J. Lengler, B. Häupler, M. Probst
Abstract | This is a graduate-level course on algorithm design (and analysis). It covers a range of topics and techniques in approximation algorithms, sketching and streaming algorithms, and online algorithms.
Learning objective | This course familiarizes the students with some of the main tools and techniques in modern subareas of algorithm design.
Content | The lectures will cover a range of topics, tentatively including the following: graph sparsification while preserving cuts or distances, various approximation algorithm techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms (a tiny streaming example follows this entry), sketching algorithms, and derandomization.
Lecture notes | https://people.inf.ethz.ch/~aroeyskoe/AA23
Prerequisites / Notice | This course is designed for Master's and doctoral students, and it especially targets those interested in theoretical computer science, but it should also be accessible to last-year Bachelor's students. It requires sufficient comfort with both (A) algorithm design & analysis and (B) probability & concentration bounds. E.g., having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you're ready for this class or not, please consult the instructor.
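As a flavor of the streaming-algorithms topic: reservoir sampling (Algorithm R) keeps a uniform sample of k items from a stream of unknown length in O(k) memory. A minimal sketch; the stream contents are arbitrary.

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform random sample of k items in a single pass over a stream."""
    random.seed(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = random.randint(0, i)   # item i survives with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(10**6), k=5))  # 5 uniform samples, one pass, O(k) memory
```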
Electives
Subject-Specific Electives
Number | Title | Type | ECTS | Hours | Lecturers
261-5130-00L | Research in Data Science | W | 6 ECTS | 13A | Professors
Abstract | Independent work under the supervision of a core or adjunct faculty member of Data Science.
Learning objective | Independent work under the supervision of a core or adjunct faculty member of Data Science.
Content | Project done under the supervision of an approved professor.
Prerequisites / Notice | Only students who have passed at least one core course in Data Management and Processing and one core course in Data Analysis can start a research project. A project description must be submitted to the studies administration at the start of the project.
252-3005-00L | Natural Language Processing | W | 7 ECTS | 3V + 3U + 1A | R. Cotterell
Abstract | This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Learning objective | The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques. (A toy statistical language model is sketched after this entry.)
Content | This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Literature | Lectures will make use of textbooks such as the one by Jurafsky and Martin where appropriate, but will also make use of original research and survey papers.
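In the spirit of the course's statistical approaches, here is a minimal bigram language model with maximum-likelihood estimates. The tiny corpus is invented; real systems use smoothing and far more data.

```python
from collections import Counter

# Bigram language model with maximum-likelihood estimates (toy corpus, no smoothing).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def p(word, prev):
    """P(word | prev) estimated as bigram count / unigram count."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

print(p("sat", "cat"))   # 1.0: "cat" is always followed by "sat" in this corpus
print(p("dog", "the"))   # 0.25: "the" precedes cat/mat/dog/rug once each
```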
263-2400-00L | Reliable and Trustworthy Artificial Intelligence | W | 6 ECTS | 2V + 2U + 1A | M. Vechev
Kurzbeschreibung | Creating reliable, secure, robust, and fair machine learning models is a core challenge in artificial intelligence and one of fundamental importance. The goal of the course is to teach both the mathematical foundations of this new and emerging area as well as to introduce students to the latest and most exciting research in the space. | |||||||||||||||||||||||||||||||||||||||||
Lernziel | Upon completion of the course, the students should have mastered the underlying methods and be able to apply them to a variety of engineering and research problems. To facilitate deeper understanding, the course includes a group coding project where students will build a system based on the learned material. | |||||||||||||||||||||||||||||||||||||||||
Inhalt | The course is split into 4 parts: Robustness of Machine Learning -------------------------------------------- - Adversarial attacks and defenses on deep learning models. - Automated certification of deep learning models (major trends: convex relaxations, branch-and-bound, randomized smoothing). - Certified training of deep neural networks (combining symbolic and continuous methods). Privacy of Machine Learning -------------------------------------- - Threat models (e.g., stealing data, poisoning, membership inference, etc.). - Attacking federated machine learning (across vision, natural language and tabular data). - Differential privacy for defending machine learning. - AI Regulations and checking model compliance. Fairness of Machine Learning --------------------------------------- - Introduction to fairness (motivation, definitions). - Enforcing individual fairness (for both vision and tabular data). - Enforcing group fairness (e.g., demographic parity, equalized odds). Robustness, Privacy and Fairness of Foundation Models --------------------------------------------------------------------------- - We discuss all previous topics, as well as programmability, in the context of latest foundation models (e.g., LLMs). More information here: https://www.sri.inf.ethz.ch/teaching/rtai23. | |||||||||||||||||||||||||||||||||||||||||
Voraussetzungen / Besonderes | While not a formal requirement, the course assumes familiarity with basics of machine learning (especially linear algebra, gradient descent, and neural networks as well as basic probability theory). These topics are usually covered in “Intro to ML” classes at most institutions (e.g., “Introduction to Machine Learning” at ETH). The coding project will utilize Python and PyTorch. Thus some programming experience in Python is expected. Students without prior knowledge of PyTorch are expected to acquire it early in the course by solving exercise sheets. | |||||||||||||||||||||||||||||||||||||||||
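To make the adversarial-attacks item concrete, here is a minimal FGSM (fast gradient sign method) sketch in PyTorch against a stand-in linear model. The model, data, and epsilon are placeholders, not the course's project code.

```python
import torch
import torch.nn.functional as F

# FGSM: perturb the input in the direction of the sign of the loss gradient.
torch.manual_seed(0)
model = torch.nn.Linear(784, 10)          # stand-in classifier (untrained, illustrative)
x = torch.rand(1, 784, requires_grad=True)
y = torch.tensor([3])

loss = F.cross_entropy(model(x), y)
loss.backward()
epsilon = 0.1                             # attack budget (hypothetical)
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1).detach()

print("loss before:", loss.item())
print("loss after: ", F.cross_entropy(model(x_adv), y).item())  # typically larger
```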
263-3210-00L | Deep Learning | W | 8 ECTS | 3V + 2U + 2A | T. Hofmann, N. Perraudin
Abstract | Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations.
Learning objective | In recent years, deep learning and deep networks have significantly improved the state of the art in many application domains, such as computer vision, speech recognition, and natural language processing. This class covers the mathematical foundations of deep learning and provides insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology. (A from-scratch backpropagation sketch follows this entry.)
Prerequisites / Notice | This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course makes heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or TensorFlow, although that may be a side benefit. Participation in the course is subject to the following condition: students must have taken the exam in Advanced Machine Learning (252-0535-00) or have acquired equivalent knowledge; see the exhaustive list below:
- Advanced Machine Learning: https://ml2.inf.ethz.ch/courses/aml/
- Computational Intelligence Lab: http://da.inf.ethz.ch/teaching/2019/CIL/
- Introduction to Machine Learning: https://las.inf.ethz.ch/teaching/introml-S19
- Statistical Learning Theory: http://ml2.inf.ethz.ch/courses/slt/
- Computational Statistics: https://stat.ethz.ch/lectures/ss19/comp-stats.php
- Probabilistic Artificial Intelligence: https://las.inf.ethz.ch/teaching/pai-f18
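In the spirit of the course's mathematics-first stance, here is a from-scratch two-layer network trained on XOR with handwritten backpropagation in NumPy. Layer sizes, learning rate, and iteration count are arbitrary illustrative choices.

```python
import numpy as np

# Two-layer network on XOR with handwritten backprop (illustrative hyperparameters).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                  # forward pass, hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid output
    dlogits = (p - y) / len(X)                # grad of mean BCE w.r.t. pre-sigmoid
    dW2, db2 = h.T @ dlogits, dlogits.sum(0)
    dh = (dlogits @ W2.T) * (1 - h**2)        # backprop through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    for param, grad in [(W1, dW1), (b1, db1), (W2, dW2), (b2, db2)]:
        param -= 0.5 * grad                   # plain gradient descent, lr = 0.5

print(np.round(p.ravel(), 2))  # should approach [0, 1, 1, 0]
```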
263-5005-00L | Artificial Intelligence in Education (does not take place this semester) | W | 3 ECTS | 1V + 0.5U | M. Sachan
Abstract | Artificial Intelligence (AI) methods have been shown to have a profound impact on educational technologies, where the great variety of tasks and data types enables us to benefit from AI techniques in many different ways. We will review relevant methods and applications of AI in various educational technologies, and work on problem sets and projects to solve problems in education with the help of AI.
Learning objective | The course will be centered around exploring methodological and system-focused perspectives on designing AI systems for education and analyzing educational data using AI methods. Students will be expected to a) engage in presentations and active in-class and asynchronous discussion, and b) work on problem sets exemplifying the use of educational data mining techniques.
Content | The course will start with an introduction to data mining techniques (e.g., prediction, structure discovery, visualization, and relationship mining) relevant to analyzing educational data. We will then continue with topics on personalization in AI in educational technologies (e.g., learner modeling and knowledge tracing, self-improving AIED systems) while showcasing exemplary applications in areas such as content curation and dialog-based tutoring. Finally, we will cover ethical challenges associated with using AI in student-facing settings. Face-to-face meetings will be held every fortnight, although students will be expected to work individually on weekly tasks (e.g., discussing relevant literature, working on problems, preparing seminar presentations).
Lecture notes | Lecture slides will be made available on the course website.
Literature | No textbook is required, but there will be regularly assigned readings from the research literature, linked on the course website.
Prerequisites / Notice | There are no prerequisites for this class. However, it will help if the student has taken an undergraduate- or graduate-level class in statistics, data science or machine learning. This class is appropriate for advanced undergraduates and Master's students in Computer Science as well as PhD students in other departments.
263-5056-00L | Applications of Deep Learning on Graphs | W | 4 ECTS | 2G + 1A | M. Kuznetsova, G. Rätsch
Abstract | Graphs are an incredibly versatile abstraction for representing arbitrary structures such as molecules, relational knowledge or social and traffic networks. This course provides a practical overview of deep (representation) learning on graphs and its applications.
Learning objective | Many established deep learning methods require dense input data with a well-defined structure (e.g. an image, a sequence of word embeddings). However, many practical applications deal with sparsely connected and complex data structures, such as molecules, knowledge graphs or social networks. Graph Neural Networks (GNNs) and representation learning on graphs in general have recently experienced a surge in popularity because they address the challenge of effectively learning representations over such structures. In this course, we aim to understand the fundamental principles of deep (representation) learning on graphs, the similarities and differences to other concepts in deep learning, as well as the unique challenges from a practical point of view. Finally, we provide an overview of recent applications of graph neural networks.
Content | Introduction to GNN concepts: 1) problem-solving on graphs (node-, edge-, and graph-level objectives), structural priors (inductive biases) of graph data, applications of graph learning; 2) Graph Neural Networks: convolutional, attentional, message passing (a minimal message-passing layer is sketched after this entry); an overview of the zoo of published operators; relations to Transformers and DeepSets; 3) expressivity of GNNs; 4) scalability of Graph Neural Networks: subsampling, clustering (pooling); 5) augmentations and self-supervised learning on graphs. Applications: drug discovery, knowledge graphs, temporal GNNs, geometric GNNs, deep generative models for graphs.
Prerequisites / Notice | 263-3210-00 Deep Learning or 263-0008-00 Computational Intelligence Lab; 252-0220-00 Introduction to Machine Learning; statistics/probability; programming in Python; Unix command line.
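The simplest convolutional form of message passing in item 2 reduces to "average your neighbors, then transform". A minimal NumPy sketch on a hypothetical 4-node graph follows; the weights are random, so the outputs are meaningless beyond their shape.

```python
import numpy as np

# One simple (convolutional) message-passing layer: H' = ReLU(D^-1 (A+I) H W).
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],     # adjacency of a hypothetical 4-node graph
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
A_hat = A + np.eye(4)           # add self-loops so a node keeps its own features
D_inv = np.diag(1 / A_hat.sum(axis=1))
H = rng.normal(size=(4, 3))     # node features (3 channels)
W = rng.normal(size=(3, 8))     # learnable weights (random stand-ins here)

H_next = np.maximum(0, D_inv @ A_hat @ H @ W)   # aggregate neighbors, transform, ReLU
print(H_next.shape)             # (4, 8): new 8-channel representation per node
```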
263-5300-00L | Guarantees for Machine Learning | W | 7 ECTS | 3V + 1U + 2A | F. Yang
Abstract | This course is aimed at advanced Master's and doctoral students who want to conduct independent research on theory for modern machine learning (ML). It teaches standard methods in statistical learning theory commonly used to prove theoretical guarantees for ML algorithms. The knowledge is then applied in independent project work to understand and follow up on recent theoretical ML results.
Learning objective | By the end of the semester, students should be able to:
- understand a good fraction of theory papers published in the typical ML venues; for this purpose, students will learn common mathematical techniques from statistical learning in the first part of the course and apply this knowledge in the project work
- critically examine recently published work in terms of relevance and find impactful (novel) research problems; this will be an integral part of the project work and involves experimental as well as theoretical questions
- outline a possible approach to prove a conjectured theorem, e.g. by reducing it to more solvable subproblems; this will be practiced in in-person exercises, homework and potentially in the final project
- effectively communicate and present the problem motivation, new insights and results to a technical audience; this will be learned primarily via the final presentation and report, as well as during peer-grading of peer talks
Content | This course covers foundational methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, touching on the following topics:
- concentration bounds (a small numerical illustration follows this entry)
- uniform convergence and empirical process theory
- regularization for non-parametric statistics (e.g. in RKHS, neural networks)
- high-dimensional learning
- computational and statistical learnability (information-theoretic, PAC, SQ)
- overparameterized models, implicit bias and regularization
The project work focuses on current theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to:
- how overparameterized models generalize (statistically) and converge (computationally)
- complexity measures and approximation-theoretic properties of randomly initialized and trained neural networks
- generalization of robust learning (adversarial or distribution-shift robustness)
- private and fair learning
Prerequisites / Notice | Students should have a very strong mathematical background (real analysis, probability theory, linear algebra) and solid knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". In addition to these prerequisites, this class requires a high degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. Students have usually taken a subset of: Fundamentals of Mathematical Statistics, Probabilistic AI, Neural Network Theory, Optimization for Data Science, Advanced ML, Statistical Learning Theory, Probability Theory (D-MATH).
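As a taste of the concentration-bounds topic: Hoeffding's inequality states that for n i.i.d. samples in [0, 1], P(|X̄ − μ| ≥ t) ≤ 2·exp(−2nt²). The sketch below compares the bound against an empirical estimate; sample size and t are arbitrary.

```python
import numpy as np

# Empirical check of Hoeffding's inequality for Bernoulli(0.5) samples.
rng = np.random.default_rng(0)
n, t, trials = 100, 0.1, 100_000
means = rng.binomial(1, 0.5, size=(trials, n)).mean(axis=1)

empirical = np.mean(np.abs(means - 0.5) >= t)
bound = 2 * np.exp(-2 * n * t**2)
print(f"empirical {empirical:.4f} <= Hoeffding bound {bound:.4f}")
```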
263-5902-00L | Computer Vision | W | 8 ECTS | 3V + 1U + 3A | M. Pollefeys, S. Tang, F. Yu
Abstract | The goal of this course is to provide students with a good understanding of computer vision and image analysis techniques. The main concepts and techniques will be studied in depth, and practical algorithms and approaches will be discussed and explored through the exercises.
Learning objective | The objectives of this course are: 1. to introduce the fundamental problems of computer vision; 2. to introduce the main concepts and techniques used to solve those problems; 3. to enable participants to implement solutions for reasonably complex problems; 4. to enable participants to make sense of the computer vision literature.
Content | Camera models and calibration (a projection example follows this entry), invariant features, multiple-view geometry, model fitting, stereo matching, segmentation, 2D shape matching, shape from silhouettes, optical flow, structure from motion, tracking, object recognition, object category recognition.
Prerequisites / Notice | It is recommended that students have taken the Visual Computing lecture or a similar course introducing basic image processing concepts before taking this course.
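A worked instance of the camera-model topic: projecting a 3D point with a pinhole camera, x ~ K [R | t] X in homogeneous coordinates. The intrinsics and the point below are made-up values.

```python
import numpy as np

# Pinhole projection: x ~ K [R | t] X  (homogeneous coordinates; toy numbers).
K = np.array([[800,   0, 320],   # focal lengths and principal point (hypothetical)
              [  0, 800, 240],
              [  0,   0,   1]], dtype=float)
R, t = np.eye(3), np.array([0.0, 0.0, 0.0])   # camera at the origin, looking down +z
X = np.array([0.1, -0.2, 2.0])                # a 3D point two meters ahead

x_h = K @ (R @ X + t)          # project into homogeneous image coordinates
u, v = x_h[:2] / x_h[2]        # divide by depth to get pixel coordinates
print(u, v)                    # 360.0, 160.0
```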
227-0155-00L | Machine Learning on Microcontrollers (registration requires the permission of the instructors; preference is given to students in the MSc EEIT) | W | 6 ECTS | 4G | M. Magno, L. Benini
Abstract | Machine Learning (ML) and artificial intelligence are pervading the digital society. Today, even low-power embedded systems are incorporating ML, becoming increasingly "smart". This lecture gives an overview of ML methods and algorithms to process and extract useful near-sensor information in end-nodes of the "internet of things", using low-power microcontrollers/processors (ARM Cortex-M; RISC-V).
Learning objective | Learn how to process data from sensors and how to extract useful information with low-power microprocessors using ML techniques. We will analyze data coming from real low-power sensors (accelerometers, microphones, ExG bio-signals, cameras...). The main objective is to study in detail how machine learning algorithms can be adapted to the performance constraints and limited resources of low-power microcontrollers.
Content | The final goal of the course is a deep understanding of machine learning and its practical implementation on single- and multi-core microcontrollers, coupled with performance and energy-efficiency analysis and optimization. The main topics of the course include:
- Sensors and sensor data acquisition with low-power embedded systems
- Machine learning: overview of supervised and unsupervised learning, and in particular supervised learning (Bayes decision theory, decision trees, random forests, kNN methods, support vector machines, convolutional networks and deep learning)
- Low-power embedded systems and their architecture: low-power microcontrollers (ARM Cortex-M) and RISC-V-based Parallel Ultra Low Power (PULP) systems-on-chip
- Low-power smart sensor system design: hardware-software tradeoffs, analysis, and optimization; implementation and performance evaluation of ML in battery-operated embedded systems (a weight-quantization sketch follows this entry)
The laboratory exercises will show how to address concrete design problems, like motion and gesture recognition, emotion detection, and image and sound classification, using real sensor data and real MCU boards. Presentations from PhD students and a visit to the Digital Circuits and Systems Group will introduce current research topics and international research projects.
Lecture notes | Script and exercise sheets. Books will be suggested during the course.
Prerequisites / Notice | Prerequisites: C language programming; basics of digital signal processing; basics of processor and computer architecture. Some exposure to machine learning concepts is also desirable.
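One recurring technique for fitting ML models into microcontroller memory is 8-bit quantization of weights. Below is a minimal affine-quantization sketch in NumPy (scale/zero-point scheme); the weight values are random stand-ins, and deployed toolchains add per-channel scales and calibration on top of this idea.

```python
import numpy as np

# Affine int8 quantization of a weight tensor: w ≈ scale * (q - zero_point).
rng = np.random.default_rng(0)
w = rng.normal(scale=0.5, size=256).astype(np.float32)   # stand-in layer weights

scale = (w.max() - w.min()) / 255.0
zero_point = int(np.round(-w.min() / scale))
q = np.clip(np.round(w / scale) + zero_point, 0, 255).astype(np.uint8)

w_hat = scale * (q.astype(np.float32) - zero_point)      # dequantize
print("max abs error:", np.abs(w - w_hat).max())         # roughly bounded by scale/2
print("memory: %d B float32 -> %d B int8" % (w.nbytes, q.nbytes))
```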
227-0417-00L | Information Theory I | W | 6 ECTS | 4G | A. Lapidoth
Abstract | This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Learning objective | The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content | The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding (sketched after this entry), arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
Literature | T. M. Cover and J. Thomas, Elements of Information Theory (second edition).
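Huffman coding, listed in the syllabus, admits a compact sketch with a binary heap; the symbol probabilities below are invented for illustration.

```python
import heapq

def huffman(probs):
    """Return a prefix-free code (symbol -> bitstring) for a probability table."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
        i += 1                            # unique tiebreaker keeps tuples comparable
    return heap[0][2]

print(huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}: expected length equals entropy here
```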
227-0560-00L | Computer Vision and Artificial Intelligence for Autonomous Cars (up until FS2022 offered as Deep Learning for Autonomous Driving) | W | 6 ECTS | 3V + 2P | C. Sakaridis
Abstract | This course introduces the core computer vision techniques and algorithms that autonomous cars use to perceive the semantics and geometry of their driving environment, localize themselves in it, and predict its dynamic evolution. Emphasis is placed on techniques tailored for real-world settings, such as multi-modal fusion, domain-adaptive and outlier-aware architectures, and multi-agent methods.
Learning objective | Students will learn about the fundamentals of autonomous cars and of the computer vision models and methods these cars use to analyze their environment and navigate themselves in it. Students will be presented with state-of-the-art representations and algorithms for semantic, geometric and temporal visual reasoning in automated driving, and will gain hands-on experience in developing computer vision algorithms and architectures for solving such tasks. After completing this course, students will be able to:
1. understand the operating principles of visual sensors in autonomous cars
2. differentiate between the core architectural paradigms and components of modern visual perception models and describe their logic and the role of their parameters
3. systematically categorize the main visual tasks related to automated driving and understand the primary representations and algorithms which are used for solving them
4. critically analyze and evaluate current research in the area of computer vision for autonomous cars
5. practically reproduce state-of-the-art computer vision methods in automated driving
6. independently develop new models for visual perception
Content | The lectures cover the following topics:
1. Fundamentals: (a) fundamentals of autonomous cars and their visual sensors; (b) fundamental computer vision architectures and algorithms for autonomous cars
2. Semantic perception: (a) semantic segmentation; (b) object detection; (c) instance segmentation and panoptic segmentation
3. Geometric perception and localization: (a) depth estimation; (b) 3D reconstruction; (c) visual localization; (d) unimodal visual/lidar 3D object detection
4. Robust perception with multi-modal, multi-domain and multi-agent methods: (a) multi-modal 2D and 3D object detection; (b) visual grounding and verbo-visual fusion; (c) domain-adaptive and outlier-aware semantic perception; (d) vehicle-to-vehicle communication for perception
5. Temporal perception: (a) multiple object tracking; (b) motion prediction
The practical projects involve implementing complex computer vision architectures and algorithms and applying them to real-world, multi-modal driving datasets. In particular, students will develop models and algorithms for:
1. sensor calibration and synchronization for constructing multi-modal driving datasets
2. semantic segmentation and depth estimation
3. 3D object detection and tracking using lidars
Lecture notes | Lecture slides are provided in PDF format.
Prerequisites / Notice | Students are expected to have a solid basic knowledge of linear algebra, multivariate calculus, and probability theory, and a basic background in computer vision and machine learning. All practical projects will require a solid background in programming and will be based on Python and libraries such as PyTorch, scikit-learn and scikit-image.
227-0689-00L | System Identification | W | 4 ECTS | 2V + 1U | R. Smith
Abstract | Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.
Learning objective | To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis on models suitable for feedback control design purposes, and to provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality and data quantity.
Content | Introduction to modeling: black-box and grey-box models; parametric and non-parametric models; ARX, ARMAX (etc.) models (an ARX least-squares sketch follows this entry). Predictive, open-loop, black-box identification methods; time- and frequency-domain methods; subspace identification methods. Optimal experimental design, Cramér-Rao bounds, input signal design. Parametric identification methods; on-line and batch approaches. Closed-loop identification strategies; trade-off between controller performance and information available for identification.
Literature | "System Identification: Theory for the User", Lennart Ljung, Prentice Hall (2nd ed.), 1999. Additional papers will be available via the course Moodle.
Prerequisites / Notice | Control Systems (227-0216-00L) or equivalent.
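The parametric ARX identification mentioned under Content boils down to least squares on lagged inputs and outputs. A minimal sketch that recovers the coefficients of a simulated first-order ARX system; the true coefficients and noise level are invented.

```python
import numpy as np

# Identify y[k] = a*y[k-1] + b*u[k-1] + noise by least squares (ARX(1,1) toy model).
rng = np.random.default_rng(0)
a_true, b_true, N = 0.8, 0.5, 500
u = rng.normal(size=N)                      # excitation input
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k-1] + b_true * u[k-1] + 0.05 * rng.normal()

Phi = np.column_stack([y[:-1], u[:-1]])     # regressor matrix of lagged signals
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated [a, b]:", theta)           # should be close to [0.8, 0.5]
```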
401-0625-01L | Applied Analysis of Variance and Experimental Design | W | 5 ECTS | 2V + 1U | L. Meier
Abstract | Principles of experimental design, one-way analysis of variance, contrasts and multiple comparisons, multi-factor designs and analysis of variance, complete block designs, Latin square designs, random effects and mixed effects models, split-plot designs, incomplete block designs, two-series factorials and fractional designs, power.
Learning objective | Participants will be able to plan and analyze efficient experiments in the natural sciences. They will gain practical experience by using the software R.
Content | Principles of experimental design, one-way analysis of variance (a small example follows this entry), contrasts and multiple comparisons, multi-factor designs and analysis of variance, complete block designs, Latin square designs, random effects and mixed effects models, split-plot designs, incomplete block designs, two-series factorials and fractional designs, power.
Literature | G. Oehlert: A First Course in Design and Analysis of Experiments, W.H. Freeman and Company, New York, 2000.
Prerequisites / Notice | Both the exercises and the classes will be based on procedures from the freely available, open-source statistical software R, for which an introduction will be held.
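A one-way ANOVA fits in a few lines; shown here with SciPy for consistency with the other sketches in this section, although the course itself uses R. The three treatment groups are fabricated.

```python
import numpy as np
from scipy.stats import f_oneway

# One-way ANOVA: do three (fabricated) treatment groups share a common mean?
rng = np.random.default_rng(0)
g1 = rng.normal(10.0, 2.0, size=30)
g2 = rng.normal(11.0, 2.0, size=30)
g3 = rng.normal(13.0, 2.0, size=30)

F, p = f_oneway(g1, g2, g3)     # F-statistic and p-value for H0: equal means
print(f"F = {F:.2f}, p = {p:.4g}")
```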
401-3054-14L | Probabilistic Methods in Combinatorics (does not take place this semester) | W | 5 ECTS | 2V + 1U | not specified
Abstract | This course provides a gentle introduction to the probabilistic method, with an emphasis on methodology. We will try to illustrate the main ideas by showing the application of probabilistic reasoning to various combinatorial problems.
Learning objective |
Content | The topics covered in the class will include (but are not limited to): linearity of expectation, the second moment method, the local lemma, correlation inequalities, martingales, large deviation inequalities, Janson and Talagrand inequalities, and pseudo-randomness. (A canonical first-moment argument is worked out after this entry.)
Literature | - The Probabilistic Method, by N. Alon and J. H. Spencer, 3rd edition, Wiley, 2008.
- Random Graphs, by B. Bollobás, 2nd edition, Cambridge University Press, 2001.
- Random Graphs, by S. Janson, T. Luczak and A. Rucinski, Wiley, 2000.
- Graph Coloring and the Probabilistic Method, by M. Molloy and B. Reed, Springer, 2002.
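To give the flavor of first-moment arguments, here is Erdős's classical lower bound for diagonal Ramsey numbers written out as a short derivation. This is a standard textbook argument (Alon-Spencer, Chapter 1), not material specific to this offering.

```latex
% Claim (Erdős, 1947): if \binom{n}{k} \, 2^{1-\binom{k}{2}} < 1, then R(k,k) > n.
% Proof sketch: 2-color the edges of K_n uniformly at random. For a fixed
% k-vertex set S, all \binom{k}{2} edges of S get the same color with
% probability 2^{1-\binom{k}{2}}. A union bound over all \binom{n}{k} sets gives
\[
  \Pr\bigl[\exists\,\text{monochromatic } K_k\bigr]
    \;\le\; \binom{n}{k}\, 2^{\,1-\binom{k}{2}} \;<\; 1,
\]
% so with positive probability no monochromatic K_k appears: a 2-coloring of
% K_n with no monochromatic K_k exists, and hence R(k,k) > n.
```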