Search result: Catalogue data for Autumn Semester 2023

Data Science Master
Master's Programme (Programme Regulations 2023)
Core Courses
Data Analysis
Number | Title | Type | ECTS | Hours | Lecturers
252-0535-00L | Advanced Machine Learning | W | 10 | 3V + 2U + 4A | J. M. Buhmann
Abstract: Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective: Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data.
Content: The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, in which they implement and apply well-known algorithms to real-world data.

Topics covered in the lecture include:

Fundamentals:
What is data?
Bayesian Learning
Computational learning theory

Supervised learning:
Ensembles: Bagging and Boosting
Max Margin methods (a standard formulation is given after this list)
Neural networks

Unsupervised learning:
Dimensionality reduction techniques
Clustering
Mixture Models
Non-parametric density estimation
Learning Dynamical Systems
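As a point of reference for the max-margin methods named above (a standard textbook formulation, not material specific to this lecture), the hard-margin support vector machine can be written as

\[
\min_{w,\,b} \ \tfrac{1}{2}\|w\|^2 \quad \text{subject to} \quad y_i\,(w^\top x_i + b) \ge 1, \qquad i = 1,\dots,n,
\]

where each training point \((x_i, y_i)\) with \(y_i \in \{-1,+1\}\) is forced onto the correct side of the hyperplane with geometric margin at least \(1/\|w\|\).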
Lecture notes: No lecture notes, but slides will be made available on the course webpage.
Literature: C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007.

R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001.

T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.

L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
Prerequisites / Notice: The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving the assignments.
Students should have taken at least "Introduction to Machine Learning" or an equivalent course offered by another institution.

PhD students are required to obtain a passing grade in the course (4.0 or higher based on project and exam) to gain credit points.
263-5210-00L | Probabilistic Artificial Intelligence (registration restricted) | W | 8 | 3V + 2U + 2A | A. Krause
Abstract: This course introduces core modeling techniques and algorithms from machine learning, optimization and control for reasoning and decision making under uncertainty, and studies applications in areas such as robotics.
Learning objective: How can we build systems that perform well in uncertain environments? How can we develop systems that exhibit "intelligent" behavior, without prescribing explicit rules? How can we build systems that learn from experience in order to improve their performance? We will study core modeling techniques and algorithms from statistics, optimization, planning, and control and study applications in areas such as robotics. The course is designed for graduate students.
Content: Topics covered:
- Probability
- Probabilistic inference (variational inference, MCMC)
- Bayesian learning (Gaussian processes, Bayesian deep learning; standard posterior formulas are given after this list)
- Probabilistic planning (MDPs, POMDPs)
- Multi-armed bandits and Bayesian optimization
- Reinforcement learning
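For orientation only, Gaussian process regression (mentioned in the list above) has a standard closed-form posterior: with training inputs \(X\), noisy targets \(y\) (noise variance \(\sigma^2\)), kernel matrix \(K = k(X,X)\) and a test input \(x_*\),

\[
\mu_* = k(x_*, X)\,(K + \sigma^2 I)^{-1} y, \qquad
\sigma_*^2 = k(x_*, x_*) - k(x_*, X)\,(K + \sigma^2 I)^{-1}\,k(X, x_*).
\]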
Prerequisites / Notice: Solid basic knowledge in statistics, algorithms and programming.
The material covered in the course "Introduction to Machine Learning" is considered a prerequisite.
401-4944-20L | Mathematics of Data Science | W | 8 | 4G + 1A | A. Bandeira, A. Maillard
Abstract: A mostly self-contained but fast-paced introductory master's-level course on various theoretical aspects of algorithms that aim to extract information from data.
Learning objective: Introduction to various mathematical aspects of Data Science.
Content: These topics lie in the overlap of (Applied) Mathematics with Computer Science, Electrical Engineering, Statistics, and/or Operations Research. Each lecture will feature a couple of mathematical open problems related to Data Science. The main mathematical tools used will be Probability and Linear Algebra, and a basic familiarity with these subjects is required. There will also be some Graph Theory, Representation Theory, and Applied Harmonic Analysis, among others (although knowledge of these tools is not assumed). The topics treated will include Dimension reduction, Manifold learning, Sparse recovery, Random Matrices, Approximation Algorithms, Community detection in graphs, and several others.
Lecture notes: https://people.math.ethz.ch/~abandeira/BandeiraSingerStrohmer-MDS-draft.pdf
Prerequisites / Notice: The main mathematical tools used will be Probability and Linear Algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.

We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary.
A. Bandeira and H. Bölcskei
Data Management and Data Processing
Number | Title | Type | ECTS | Hours | Lecturers
263-3010-00L | Big Data (registration restricted) | W | 10 | 3V + 2U + 4A | G. Fourny
Abstract: The key challenge of the information society is to turn data into information, information into knowledge, and knowledge into value. This has become increasingly complex. Data comes in larger volumes and diverse shapes, from different sources. Data is more heterogeneous and less structured than forty years ago. Nevertheless, it still needs to be processed fast, with support for complex operations.
Learning objective: Do you want to be able to query your own data productively and efficiently in your future semester projects, master's thesis, or PhD thesis? Are you looking for something beyond the Python+Pandas hype? This course teaches you how to do so, as well as the dos and don'ts.

"Big Data" refers to the case when the amount of data is very large (100 GB and more), or when the data is not completely structured (or messy). The Big Data revolution has led to a completely new way to do business, e.g., develop new products and business models, but also to do science -- which is sometimes referred to as data-driven science or the "fourth paradigm".

Unfortunately, the quantity of data produced and available -- now in the Zettabyte range (that's 21 zeros) per year -- keeps growing faster than our ability to process it. Hence, new architectures and approaches for processing it are needed. Harnessing them must involve a deep understanding of data not only in the large, but also in the small.

The field of databases evolves at a fast pace. In order to be prepared, to the extent possible, for the (r)evolutions that will take place in the next few decades, the emphasis of the lecture will be on the paradigms and core design ideas, while today's technologies will serve as supporting illustrations thereof.

After attending this lecture, you should have gained an overview and understanding of the Big Data landscape, which is the basis on which one can make informed decisions, i.e., pick and orchestrate the relevant technologies together to address each one of your projects efficiently and consistently.
Content: This course gives an overview of database technologies and of the most important database design principles that lay the foundations of the Big Data universe. We take the monolithic, one-machine relational stack from the 1970s, smash it down and rebuild it on top of large clusters: starting with distributed storage, and all the way up to syntax, models, validation, processing, indexing, and querying. A broad range of aspects is covered, with a focus on how they all fit together in the big picture of the Big Data ecosystem.

No data is harmed during this course; however, please be psychologically prepared that our data may not always be in third normal form.

- physical storage: distributed file systems (HDFS), object storage (S3), key-value stores

- logical storage: document stores (MongoDB), column stores (HBase), graph databases (neo4j), data warehouses (ROLAP)

- data formats and syntaxes (XML, JSON, RDF, Turtle, CSV, XBRL, YAML, protocol buffers, Avro)

- data shapes and models (tables, trees, graphs, cubes)

- type systems and schemas: atomic types, structured types (arrays, maps), set-based type systems (?, *, +)

- an overview of functional, declarative programming languages across data shapes (SQL, XQuery, JSONiq, Cypher, MDX)

- the most important query paradigms (selection, projection, joining, grouping, ordering, windowing)

- paradigms for parallel processing, two-stage (MapReduce) and DAG-based (Spark); a minimal sketch of the two-stage paradigm follows this list

- resource management (YARN)

- what a data center is made of and why it matters (racks, nodes, ...)

- underlying architectures (internal machinery of HDFS, HBase, Spark, neo4j)

- optimization techniques (functional and declarative paradigms, query plans, rewrites, indexing)

- applications.

Large scale analytics and machine learning are outside of the scope of this course.
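To make the two-stage MapReduce paradigm listed above concrete, here is a minimal, purely illustrative Python sketch of a word count split into a map phase, a shuffle by key, and a reduce phase. The function names and the toy data are placeholders, not part of any framework taught in the course.

from collections import defaultdict

def map_phase(documents):
    # Emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Group all emitted values by key, as the framework does between the two stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()

def reduce_phase(grouped):
    # Aggregate the values of each key into a single result.
    for key, values in grouped:
        yield (key, sum(values))

documents = ["big data systems", "big clusters process big data"]
print(dict(reduce_phase(shuffle(map_phase(documents)))))
# {'big': 3, 'data': 2, 'systems': 1, 'clusters': 1, 'process': 1}

A DAG-based engine such as Spark generalizes this pattern by lazily chaining many such stages instead of fixing exactly one map and one reduce.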
Literature: Course textbook: https://ghislainfourny.github.io/big-data-textbook/

Papers from scientific conferences and journals. References will be given as part of the course material during the semester.
Prerequisites / Notice: The lecture is hybrid, meaning you can attend with us in the lecture hall, or on Zoom, or watch the recordings on YouTube later. Exercise sessions are in person.

This course, in the autumn semester, is only intended for:
- Computer Science students
- Data Science students
- CBB students with a Computer Science background

Mobility students in CS are also welcome and encouraged to attend. If you experience any issue while registering, please contact the study administration and you will be gladly added.

For students of all other departments interested in this fascinating topic: I would love to have you visit my lectures as well! So there is a series of two courses specially designed for you:
- "Information Systems for Engineers" (SQL, relational databases): this Fall
- "Big Data for Engineers" (similar to Big Data, but adapted for non Computer Scientists): Spring 2023
There is no hard dependency, so you can take them in either order, but it may be more enjoyable to start with Information Systems for Engineers.

Students who successfully completed Big Data for Engineers are not allowed to enrol in the course Big Data.
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed), Techniques and Technologies (assessed)
- Method-specific Competencies: Analytical Competencies (assessed), Decision-making (assessed), Media and Digital Technologies (fostered), Problem-solving (fostered)
- Social Competencies: Communication (fostered), Sensitivity to Diversity (fostered), Negotiation (fostered)
- Personal Competencies: Creative Thinking (fostered), Critical Thinking (fostered), Integrity and Work Ethic (fostered)
263-3845-00L | Data Management Systems | W | 8 | 3V + 1U + 3A | G. Alonso
Does not take place this semester.
Abstract: The course will cover the implementation aspects of data management systems, using relational database engines as a starting point to cover the basic concepts of efficient data processing and then expanding those concepts to modern implementations in data centers and the cloud.
Learning objective: The goal of the course is to convey the fundamental aspects of efficient data management from a systems implementation perspective: storage, access, organization, indexing, consistency, concurrency, transactions, distribution, query compilation vs. interpretation, data representations, etc. Using conventional relational engines as a starting point, the course will aim at providing in-depth coverage of the latest technologies used in data centers and the cloud to implement large-scale data processing in various forms.
Content: The course will first cover fundamental concepts in data management: storage, locality, query optimization, declarative interfaces, concurrency control and recovery, buffer managers, management of the memory hierarchy, presenting them in a system-independent manner. The course will place a special emphasis on understanding these basic principles, as they are key to understanding the problems existing systems try to address. It will then proceed to explore their implementation in modern relational engines supporting SQL, and then expand to the range of systems used in the cloud: key-value stores, geo-replication, query-as-a-service, serverless, large-scale analytics engines, etc.
Literature: The main source of information for the course will be articles and research papers describing the architecture of the systems discussed. The list of papers will be provided at the beginning of the course.
Prerequisites / Notice: The course requires students to have completed the Data Modelling and Databases course at the Bachelor level, as it assumes knowledge of databases and SQL.
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed), Techniques and Technologies (assessed)
263-4500-00L | Advanced Algorithms | W | 9 | 3V + 2U + 3A | J. Lengler, B. Häupler, M. Probst
Abstract: This is a graduate-level course on algorithm design (and analysis). It covers a range of topics and techniques in approximation algorithms, sketching and streaming algorithms, and online algorithms.
Learning objective: This course familiarizes the students with some of the main tools and techniques in modern subareas of algorithm design.
Content: The lectures will cover a range of topics, tentatively including the following: graph sparsifications while preserving cuts or distances, various approximation algorithms techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms, sketching algorithms, and derandomization.
Lecture notes: https://people.inf.ethz.ch/~aroeyskoe/AA23
Prerequisites / Notice: This course is designed for master's and doctoral students, and it especially targets those interested in theoretical computer science, but it should also be accessible to last-year bachelor students.

Sufficient comfort with both (A) Algorithm Design & Analysis and (B) Probability & Concentration is expected. E.g., having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you are ready for this class, please consult the instructor.
Competencies
- Subject-specific Competencies: Concepts and Theories (fostered)
- Method-specific Competencies: Analytical Competencies (fostered), Decision-making (fostered), Problem-solving (fostered)
Elective Courses
Subject-Specific Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
261-5130-00L | Research in Data Science (registration restricted) | W | 6 | 13A | Professors
Abstract: Independent work under the supervision of a core or adjunct faculty member of data science.
Learning objective: Independent work under the supervision of a core or adjunct faculty member of data science.
Content: Project done under the supervision of an approved professor.
Prerequisites / Notice: Only students who have passed at least one core course in Data Management and Processing and one core course in Data Analysis can start a research project.

A project description must be submitted at the start of the project to the studies administration.
252-3005-00L | Natural Language Processing (registration restricted) | W | 7 | 3V + 3U + 1A | R. Cotterell
Abstract: This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Learning objective: The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.
Content: This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Literature: Lectures will make use of textbooks such as the one by Jurafsky and Martin where appropriate, but will also make use of original research and survey papers.
263-2400-00L | Reliable and Trustworthy Artificial Intelligence | W | 6 | 2V + 2U + 1A | M. Vechev
Abstract: Creating reliable, secure, robust, and fair machine learning models is a core challenge in artificial intelligence and one of fundamental importance. The goal of the course is both to teach the mathematical foundations of this new and emerging area and to introduce students to the latest and most exciting research in the space.
Learning objective: Upon completion of the course, the students should have mastered the underlying methods and be able to apply them to a variety of engineering and research problems. To facilitate deeper understanding, the course includes a group coding project where students will build a system based on the learned material.
Content: The course is split into 4 parts:

Robustness of Machine Learning
--------------------------------------------

- Adversarial attacks and defenses on deep learning models.
- Automated certification of deep learning models (major trends: convex relaxations, branch-and-bound, randomized smoothing); a minimal sketch of the simplest relaxation follows this list.
- Certified training of deep neural networks (combining symbolic and continuous methods).
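As a small illustration of the convex-relaxation trend mentioned in the list above, the sketch below shows interval bound propagation, arguably the simplest such relaxation: an axis-aligned box around the input is pushed through an affine layer and a ReLU. This is an illustrative NumPy sketch with made-up numbers, not the specific certification methods covered in the course.

import numpy as np

def affine_bounds(lower, upper, W, b):
    # Propagate the box [lower, upper] through x -> W x + b.
    center = (upper + lower) / 2.0
    radius = (upper - lower) / 2.0
    new_center = W @ center + b
    new_radius = np.abs(W) @ radius
    return new_center - new_radius, new_center + new_radius

def relu_bounds(lower, upper):
    # ReLU is monotone, so it maps the box elementwise.
    return np.maximum(lower, 0.0), np.maximum(upper, 0.0)

# Toy layer: if the certified lower bound of the true class stays above the
# upper bounds of the other classes, no perturbation inside the input box
# can change the prediction.
W = np.array([[1.0, -0.5], [0.3, 0.8]])
b = np.array([0.1, -0.2])
l, u = affine_bounds(np.array([0.9, 0.1]), np.array([1.1, 0.3]), W, b)
l, u = relu_bounds(l, u)
print(l, u)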

Privacy of Machine Learning
--------------------------------------

- Threat models (e.g., stealing data, poisoning, membership inference, etc.).
- Attacking federated machine learning (across vision, natural language and tabular data).
- Differential privacy for defending machine learning.
- AI Regulations and checking model compliance.

Fairness of Machine Learning
---------------------------------------

- Introduction to fairness (motivation, definitions).
- Enforcing individual fairness (for both vision and tabular data).
- Enforcing group fairness (e.g., demographic parity, equalized odds).

Robustness, Privacy and Fairness of Foundation Models
---------------------------------------------------------------------------

- We discuss all previous topics, as well as programmability, in the context of the latest foundation models (e.g., LLMs).

More information here: https://www.sri.inf.ethz.ch/teaching/rtai23.
Prerequisites / Notice: While not a formal requirement, the course assumes familiarity with the basics of machine learning (especially linear algebra, gradient descent, and neural networks, as well as basic probability theory). These topics are usually covered in "Intro to ML" classes at most institutions (e.g., "Introduction to Machine Learning" at ETH).


The coding project will utilize Python and PyTorch. Thus some programming experience in Python is expected. Students without prior knowledge of PyTorch are expected to acquire it early in the course by solving exercise sheets.
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed), Techniques and Technologies (assessed)
- Method-specific Competencies: Analytical Competencies (assessed), Problem-solving (assessed)
- Personal Competencies: Creative Thinking (assessed), Critical Thinking (assessed)
263-3210-00L | Deep Learning (registration restricted) | W | 8 | 3V + 2U + 2A | T. Hofmann, N. Perraudin
Abstract: Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations.
Learning objective: In recent years, deep learning and deep networks have significantly improved the state of the art in many application domains, such as computer vision, speech recognition, and natural language processing. This class will cover the mathematical foundations of deep learning and provide insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology.
Prerequisites / Notice: This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course will make heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or TensorFlow, although that may be a side benefit.

The participation in the course is subject to the following condition:
- Students must have taken the exam in Advanced Machine Learning (252-0535-00) or have acquired equivalent knowledge; see the exhaustive list below:

Advanced Machine Learning
https://ml2.inf.ethz.ch/courses/aml/

Computational Intelligence Lab
http://da.inf.ethz.ch/teaching/2019/CIL/

Introduction to Machine Learning
https://las.inf.ethz.ch/teaching/introml-S19

Statistical Learning Theory
http://ml2.inf.ethz.ch/courses/slt/

Computational Statistics
https://stat.ethz.ch/lectures/ss19/comp-stats.php

Probabilistic Artificial Intelligence
https://las.inf.ethz.ch/teaching/pai-f18
263-5005-00L | Artificial Intelligence in Education | W | 3 | 1V + 0.5U | M. Sachan
Does not take place this semester.
Abstract: Artificial Intelligence (AI) methods have been shown to have a profound impact on educational technologies, where the great variety of tasks and data types enables us to benefit from AI techniques in many different ways. We will review relevant methods and applications of AI in various educational technologies, and work on problem sets and projects to solve problems in education with the help of AI.
Learning objective: The course will be centered around exploring methodological and system-focused perspectives on designing AI systems for education and analyzing educational data using AI methods. Students will be expected to a) engage in presentations and active in-class and asynchronous discussion, and b) work on problem sets exemplifying the use of educational data mining techniques.
Content: The course will start with an introduction to data mining techniques (e.g., prediction, structured discovery, visualization, and relationship mining) relevant to analyzing educational data. We will then continue with topics on personalization in AI in educational technologies (e.g., learner modeling and knowledge tracing, self-improving AIED systems) while showcasing exemplary applications in areas such as content curation and dialog-based tutoring. Finally, we will cover ethical challenges associated with using AI in student-facing settings. Face-to-face meetings will be held every fortnight, although students will be expected to work individually on weekly tasks (e.g., discussing relevant literature, working on problems, preparing seminar presentations).
Lecture notes: Lecture slides will be made available on the course website.
Literature: No textbook is required, but there will be regularly assigned readings from the research literature, linked on the course website.
Prerequisites / Notice: There are no prerequisites for this class. However, it will help if the student has taken an undergraduate- or graduate-level class in statistics, data science or machine learning. This class is appropriate for advanced undergraduates and master's students in Computer Science as well as PhD students in other departments.
263-5056-00L | Applications of Deep Learning on Graphs (registration restricted) | W | 4 | 2G + 1A | M. Kuznetsova, G. Rätsch
Abstract: Graphs are an incredibly versatile abstraction to represent arbitrary structures such as molecules, relational knowledge or social and traffic networks. This course provides a practical overview of deep (representation) learning on graphs and its applications.
Learning objective: Many established deep learning methods require dense input data with a well-defined structure (e.g. an image, a sequence of word embeddings). However, many practical applications deal with sparsely connected and complex data structures, such as molecules, knowledge graphs or social networks. Graph Neural Networks (GNNs) and general representation learning on graphs have recently experienced a surge in popularity because they address the challenge of effectively learning representations over such structures. In this course, we aim to understand the fundamental principles of deep (representation) learning on graphs, the similarities and differences to other concepts in deep learning, as well as the unique challenges from a practical point of view. Finally, we provide an overview of recent applications of graph neural networks.
Content: Introduction to GNN concepts: 1) problem-solving on graphs (node-, edge-, and graph-level objectives), structural priors (inductive biases) of graph data, applications for graph learning. 2) Graph Neural Networks: convolutional, attentional, message passing; overview of the zoo of published operators; relations to Transformers and DeepSets. 3) Expressivity of GNNs. 4) Scalability of Graph Neural Networks: subsampling, clustering (pooling). 5) Augmentations and self-supervised learning on graphs. Applications: drug discovery, knowledge graphs, temporal GNNs, geometric GNNs, deep generative models for graphs.
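As a minimal illustration of the message-passing idea named in the content above, one generic mean-aggregation layer can be sketched in plain NumPy as follows. The layer and the toy graph are illustrative only and do not correspond to a specific operator from the course.

import numpy as np

def message_passing_layer(H, A, W_self, W_neigh):
    # H: node features (n x d), A: adjacency matrix (n x n), W_*: weight matrices.
    # Mean-aggregate neighbour features, then apply a linear update and a ReLU.
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    neighbour_mean = (A @ H) / deg
    return np.maximum(H @ W_self + neighbour_mean @ W_neigh, 0.0)

# Toy graph with 3 nodes and 2 input features.
A = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
H = np.random.randn(3, 2)
W_self, W_neigh = np.random.randn(2, 4), np.random.randn(2, 4)
print(message_passing_layer(H, A, W_self, W_neigh).shape)  # (3, 4)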
Prerequisites / Notice: 263-3210-00 Deep Learning or 263-0008-00 Computational Intelligence Lab;
252-0220-00 Introduction to Machine Learning; Statistics/Probability; Programming in Python; Unix Command Line.
263-5300-00L | Guarantees for Machine Learning (registration restricted) | W | 7 | 3V + 1U + 2A | F. Yang
Abstract: This course is aimed at advanced master's and doctoral students who want to conduct independent research on theory for modern machine learning (ML). It teaches standard methods in statistical learning theory commonly used to prove theoretical guarantees for ML algorithms. The knowledge is then applied in independent project work to understand and follow up on recent theoretical ML results.
Learning objective: By the end of the semester, students should be able to

- understand a good fraction of theory papers published in the typical ML venues. For this purpose, students will learn common mathematical techniques from statistical learning in the first part of the course and apply this knowledge in the project work

- critically examine recently published work in terms of relevance and find impactful (novel) research problems. This will be an integral part of the project work and involves experimental as well as theoretical questions

- outline a possible approach to prove a conjectured theorem, e.g. by reducing it to more tractable subproblems. This will be practiced in in-person exercises, homeworks and potentially in the final project

- effectively communicate and present the problem motivation, new insights and results to a technical audience. This will be primarily learned via the final presentation and report as well as during peer-grading of peer talks.
Content: This course covers foundational methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms. It touches on the following topics:
- concentration bounds (a standard example is given after this list)
- uniform convergence and empirical process theory
- regularization for non-parametric statistics (e.g. in RKHS, neural networks)
- high-dimensional learning
- computational and statistical learnability (information-theoretic, PAC, SQ)
- overparameterized models, implicit bias and regularization
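As one standard example of the concentration bounds listed above (stated here only for orientation), Hoeffding's inequality bounds the deviation of an empirical mean from its expectation: for independent random variables \(X_1,\dots,X_n\) with \(X_i \in [a,b]\),

\[
\Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right] \right| \ge t \right) \le 2 \exp\!\left( -\frac{2 n t^2}{(b-a)^2} \right).
\]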

The project work focuses on current theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to
- how overparameterized models generalize (statistically) and converge (computationally)
- complexity measures and approximation theoretic properties of randomly initialized and trained neural networks
- generalization of robust learning (adversarial or distribution-shift robustness)
- private and fair learning
Prerequisites / Notice: Students should have a very strong mathematical background (real analysis, probability theory, linear algebra) and solid knowledge of core concepts in machine learning taught in courses such as “Introduction to Machine Learning” and “Regression”/“Statistical Modelling”. In addition to these prerequisites, this class requires a high degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.

Students have usually taken a subset of Fundamentals of Mathematical Statistics, Probabilistic AI, Neural Network Theory, Optimization for Data Science, Advanced ML, Statistical Learning Theory, Probability Theory (D-MATH)
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed)
- Method-specific Competencies: Analytical Competencies (assessed), Problem-solving (assessed)
- Social Competencies: Communication (assessed), Cooperation and Teamwork (assessed)
- Personal Competencies: Creative Thinking (assessed), Critical Thinking (assessed)
263-5902-00L | Computer Vision | W | 8 | 3V + 1U + 3A | M. Pollefeys, S. Tang, F. Yu
Abstract: The goal of this course is to provide students with a good understanding of computer vision and image analysis techniques. The main concepts and techniques will be studied in depth, and practical algorithms and approaches will be discussed and explored through the exercises.
Learning objective: The objectives of this course are:
1. To introduce the fundamental problems of computer vision.
2. To introduce the main concepts and techniques used to solve those problems.
3. To enable participants to implement solutions for reasonably complex problems.
4. To enable participants to make sense of the computer vision literature.
Content: Camera models and calibration, invariant features, multiple-view geometry, model fitting, stereo matching, segmentation, 2D shape matching, shape from silhouettes, optical flow, structure from motion, tracking, object recognition, object category recognition.
Prerequisites / Notice: It is recommended that students have taken the Visual Computing lecture or a similar course introducing basic image processing concepts before taking this course.
227-0155-00L | Machine Learning on Microcontrollers (registration restricted) | W | 6 | 4G | M. Magno, L. Benini
Registration in this class requires the permission of the instructors. Preference is given to students in the MSc EEIT.
Abstract: Machine Learning (ML) and artificial intelligence are pervading the digital society. Today, even low-power embedded systems are incorporating ML, becoming increasingly “smart”. This lecture gives an overview of ML methods and algorithms to process and extract useful near-sensor information in end-nodes of the “internet of things”, using low-power microcontrollers/processors (ARM Cortex-M; RISC-V).
Learning objective: Learn how to process data from sensors and how to extract useful information with low-power microprocessors using ML techniques. We will analyze data coming from real low-power sensors (accelerometers, microphones, ExG bio-signals, cameras…). The main objective is to study in detail how machine learning algorithms can be adapted to the performance constraints and limited resources of low-power microcontrollers.
Content: The final goal of the course is a deep understanding of machine learning and its practical implementation on single- and multi-core microcontrollers, coupled with performance and energy-efficiency analysis and optimization. The main topics of the course include:

- Sensors and sensor data acquisition with low power embedded systems

- Machine Learning: Overview of supervised and unsupervised learning and in particular supervised learning (Bayes Decision Theory, Decision Trees, Random Forests, kNN-Methods, Support Vector Machines, Convolutional Networks and Deep Learning)

- Low-power embedded systems and their architecture. Low Power microcontrollers (ARM-Cortex M) and RISC-V-based Parallel Ultra Low Power (PULP) systems-on-chip.

- Low power smart sensor system design: hardware-software tradeoffs, analysis, and optimization. Implementation and performance evaluation of ML in battery-operated embedded systems.

The laboratory exercises will show how to address concrete design problems, like motion and gesture recognition, emotion detection, and image and sound classification, using real sensor data and real MCU boards.

Presentations from Ph.D. students and the visit to the Digital Circuits and Systems Group will introduce current research topics and international research projects.
Lecture notes: Script and exercise sheets. Books will be suggested during the course.
Prerequisites / Notice: C language programming. Basics of digital signal processing. Basics of processor and computer architecture. Some exposure to machine learning concepts is also desirable.
227-0417-00L | Information Theory I | W | 6 | 4G | A. Lapidoth
Abstract: This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Learning objective: The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content: The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
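For orientation, two of the central quantities listed above have compact standard definitions: the entropy of a discrete source \(X\) with distribution \(p\), and the capacity of a channel \(p(y \mid x)\) as the maximal mutual information over input distributions,

\[
H(X) = -\sum_{x} p(x) \log p(x), \qquad C = \max_{p(x)} I(X;Y).
\]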
Literature: T.M. Cover and J. Thomas, Elements of Information Theory (second edition).
227-0560-00L | Computer Vision and Artificial Intelligence for Autonomous Cars (registration restricted) | W | 6 | 3V + 2P | C. Sakaridis
Up until FS2022 offered as Deep Learning for Autonomous Driving.
Abstract: This course introduces the core computer vision techniques and algorithms that autonomous cars use to perceive the semantics and geometry of their driving environment, localize themselves in it, and predict its dynamic evolution. Emphasis is placed on techniques tailored for real-world settings, such as multi-modal fusion, domain-adaptive and outlier-aware architectures, and multi-agent methods.
Learning objective: Students will learn about the fundamentals of autonomous cars and of the computer vision models and methods these cars use to analyze their environment and navigate themselves in it. Students will be presented with state-of-the-art representations and algorithms for semantic, geometric and temporal visual reasoning in automated driving and will gain hands-on experience in developing computer vision algorithms and architectures for solving such tasks.

After completing this course, students will be able to:
1. understand the operating principles of visual sensors in autonomous cars
2. differentiate between the core architectural paradigms and components of modern visual perception models and describe their logic and the role of their parameters
3. systematically categorize the main visual tasks related to automated driving and understand the primary representations and algorithms which are used for solving them
4. critically analyze and evaluate current research in the area of computer vision for autonomous cars
5. practically reproduce state-of-the-art computer vision methods in automated driving
6. independently develop new models for visual perception
Content: The content of the lectures consists of the following topics:

1. Fundamentals
(a) Fundamentals of autonomous cars and their visual sensors
(b) Fundamental computer vision architectures and algorithms for autonomous cars

2. Semantic perception
(a) Semantic segmentation
(b) Object detection
(c) Instance segmentation and panoptic segmentation

3. Geometric perception and localization
(a) Depth estimation
(b) 3D reconstruction
(c) Visual localization
(d) Unimodal visual/lidar 3D object detection

4. Robust perception: multi-modal, multi-domain and multi-agent methods
(a) Multi-modal 2D and 3D object detection
(b) Visual grounding and verbo-visual fusion
(c) Domain-adaptive and outlier-aware semantic perception
(d) Vehicle-to-vehicle communication for perception

5. Temporal perception
(a) Multiple object tracking
(b) Motion prediction

The practical projects involve implementing complex computer vision architectures and algorithms and applying them to real-world, multi-modal driving datasets. In particular, students will develop models and algorithms for:
1. Sensor calibration and synchronization for constructing multi-modal driving datasets
2. Semantic segmentation and depth estimation
3. 3D object detection and tracking using lidars
Lecture notes: Lecture slides are provided in PDF format.
Prerequisites / Notice: Students are expected to have a solid basic knowledge of linear algebra, multivariate calculus, and probability theory, and a basic background in computer vision and machine learning. All practical projects will require a solid background in programming and will be based on Python and libraries such as PyTorch, scikit-learn and scikit-image.
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed), Techniques and Technologies (assessed)
- Method-specific Competencies: Analytical Competencies (assessed), Media and Digital Technologies (fostered), Problem-solving (assessed)
- Social Competencies: Communication (fostered), Cooperation and Teamwork (fostered)
- Personal Competencies: Creative Thinking (assessed), Critical Thinking (assessed)
227-0689-00L | System Identification | W | 4 | 2V + 1U | R. Smith
Abstract: Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.
Learning objective: To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis being on the development of models suitable for feedback control design purposes. To provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality and data quantity.
Content: Introduction to modeling: black-box and grey-box models; parametric and non-parametric models; ARX, ARMAX (etc.) models.
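For reference, one common textbook form of the ARX model mentioned above (generic notation, e.g. as in Ljung, not specific lecture material) relates input \(u\), output \(y\) and white noise \(e\) by

\[
y(t) + a_1 y(t-1) + \dots + a_{n_a} y(t-n_a) = b_1 u(t-1) + \dots + b_{n_b} u(t-n_b) + e(t),
\]

while ARMAX additionally passes the noise term through a moving-average polynomial.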

Predictive, open-loop, black-box identification methods. Time and frequency domain methods. Subspace identification methods.

Optimal experimental design, Cramer-Rao bounds, input signal design.

Parametric identification methods. On-line and batch approaches.

Closed-loop identification strategies. Trade-off between controller performance and information available for identification.
Literatur"System Identification; Theory for the User" Lennart Ljung, Prentice Hall (2nd Ed), 1999.

Additional papers will be available via the course Moodle.
Prerequisites / Notice: Control Systems (227-0216-00L) or equivalent.
401-0625-01L | Applied Analysis of Variance and Experimental Design | W | 5 | 2V + 1U | L. Meier
Abstract: Principles of experimental design, one-way analysis of variance, contrasts and multiple comparisons, multi-factor designs and analysis of variance, complete block designs, Latin square designs, random effects and mixed effects models, split-plot designs, incomplete block designs, two-series factorials and fractional designs, power.
Learning objective: Participants will be able to plan and analyze efficient experiments in the fields of natural sciences. They will gain practical experience by using the software R.
Content: Principles of experimental design, one-way analysis of variance, contrasts and multiple comparisons, multi-factor designs and analysis of variance, complete block designs, Latin square designs, random effects and mixed effects models, split-plot designs, incomplete block designs, two-series factorials and fractional designs, power.
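For reference, the one-way analysis of variance listed above is usually based on the effects model and the associated sum-of-squares decomposition (generic textbook notation):

\[
y_{ij} = \mu + \alpha_i + \varepsilon_{ij}, \quad \varepsilon_{ij} \sim \mathcal{N}(0, \sigma^2), \qquad
\sum_{i,j} \left(y_{ij} - \bar{y}_{\cdot\cdot}\right)^2
= \sum_{i} n_i \left(\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot}\right)^2
+ \sum_{i,j} \left(y_{ij} - \bar{y}_{i\cdot}\right)^2 .
\]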
Literature: G. Oehlert: A First Course in Design and Analysis of Experiments, W.H. Freeman and Company, New York, 2000.
Prerequisites / Notice: The exercises, but also the classes, will be based on procedures from the freely available, open-source statistical software R, for which an introduction will be held.
Competencies
- Subject-specific Competencies: Concepts and Theories (assessed), Techniques and Technologies (assessed)
- Method-specific Competencies: Analytical Competencies (assessed), Decision-making (assessed)
- Personal Competencies: Critical Thinking (assessed)
401-3054-14L | Probabilistic Methods in Combinatorics | W | 5 | 2V + 1U | not specified
Does not take place this semester.
Abstract: This course provides a gentle introduction to the Probabilistic Method, with an emphasis on methodology. We will try to illustrate the main ideas by showing the application of probabilistic reasoning to various combinatorial problems.
Learning objective:
Content: The topics covered in the class will include (but are not limited to): linearity of expectation, the second moment method, the local lemma, correlation inequalities, martingales, large deviation inequalities, Janson and Talagrand inequalities, and pseudo-randomness.
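As a small illustration of the first topic listed, linearity of expectation already yields existence results: colouring the edges of \(K_n\) red or blue uniformly at random, the expected number of monochromatic copies of \(K_4\) is

\[
\binom{n}{4} \, 2^{1-\binom{4}{2}} = \binom{n}{4} \, 2^{-5},
\]

so some two-colouring of \(K_n\) has at most this many monochromatic copies of \(K_4\).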
Literature:
- The Probabilistic Method, by N. Alon and J. H. Spencer, 3rd Edition, Wiley, 2008.
- Random Graphs, by B. Bollobás, 2nd Edition, Cambridge University Press, 2001.
- Random Graphs, by S. Janson, T. Luczak and A. Rucinski, Wiley, 2000.
- Graph Coloring and the Probabilistic Method, by M. Molloy and B. Reed, Springer, 2002.