Search result: Catalogue data in Autumn Semester 2021

Data Science Master
Core Courses
Data Analysis
Information and Learning
Number | Title | Type | ECTS | Hours | Lecturers
252-0535-00L | Advanced Machine Learning | W | 10 credits | 3V + 2U + 4A | J. M. Buhmann, C. Cotrini Jimenez
Abstract: Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective: Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data.
Content: The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data.

Topics covered in the lecture include:

Fundamentals:
What is data?
Bayesian Learning
Computational learning theory

Supervised learning:
Ensembles: Bagging and Boosting (see the sketch after this list)
Max Margin methods
Neural networks

Unsupervised learning:
Dimensionality reduction techniques
Clustering
Mixture Models
Non-parametric density estimation
Learning Dynamical Systems
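
A minimal illustration of the bagging idea from the ensemble topic above, sketched with scikit-learn on synthetic data (the library, dataset and parameters are assumptions for illustration only; the course does not prescribe them):

    # Bagging sketch (illustrative only): train many trees on bootstrap samples and average their votes.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic binary classification data.
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # 50 decision trees, each fit on a bootstrap resample of the training set.
    bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
    bagging.fit(X_train, y_train)
    print("held-out accuracy:", bagging.score(X_test, y_test))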
Lecture notes: No lecture notes, but slides will be made available on the course webpage.
Literature: C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007.

R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley &
Sons, second edition, 2001.

T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical
Learning: Data Mining, Inference and Prediction. Springer, 2001.

L. Wasserman. All of Statistics: A Concise Course in Statistical
Inference. Springer, 2004.
Prerequisites / Notice: The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE as well as practical programming experience for solving assignments.
Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution.

PhD students are required to obtain a passing grade in the course (4.0 or higher based on project and exam) to gain credit points.
227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei
Abstract: The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, fundamental limits of deep neural network learning, VC dimension.
Learning objective: After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks.
Content: 1. Universal approximation with single- and multi-layer networks (an illustrative form is sketched after this list)

2. Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory

3. Fundamental limits of deep neural network learning

4. Geometry of decision surfaces

5. Separating capacity of nonlinear decision surfaces

6. Vapnik-Chervonenkis (VC) dimension

7. VC dimension of neural networks

8. Generalization error in neural network learning
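
As an illustrative form for topic 1 above (standard background, not a quote from the lecture notes): a single-hidden-layer network with activation \(\sigma\) computes

    \[ f(x) \;=\; \sum_{j=1}^{N} c_j \, \sigma(w_j^\top x + b_j), \]

and the classical universal approximation theorems state that, for suitable (non-polynomial) \(\sigma\), such functions are dense in \(C(K)\) on compact sets \(K \subset \mathbb{R}^d\) as \(N\) grows.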
Lecture notes: Detailed lecture notes are available on the course web page
https://www.mins.ee.ethz.ch/teaching/nnt/
Prerequisites / Notice: This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
Statistics
Number | Title | Type | ECTS | Hours | Lecturers
401-3621-00L | Fundamentals of Mathematical Statistics | W | 10 credits | 4V + 1U | S. van de Geer
Abstract: The course covers the basics of inferential statistics.
Learning objective
Data Management
Number | Title | Type | ECTS | Hours | Lecturers
263-3010-00L | Big Data (restricted registration) | W | 10 credits | 3V + 2U + 4A | G. Fourny
Abstract: The key challenge of the information society is to turn data into information, information into knowledge, and knowledge into value. This has become increasingly complex. Data comes in larger volumes, in more diverse shapes, and from different sources. Data is more heterogeneous and less structured than forty years ago. Nevertheless, it still needs to be processed fast, with support for complex operations.
Learning objective: This combination of requirements, together with the technologies that have emerged in order to address them, is typically referred to as "Big Data." This revolution has led to a completely new way to do business, e.g., develop new products and business models, but also to do science -- which is sometimes referred to as data-driven science or the "fourth paradigm".

Unfortunately, the quantity of data produced and available -- now in the Zettabyte range (that's 21 zeros) per year -- keeps growing faster than our ability to process it. Hence, new architectures and approaches for processing it were and are still needed. Harnessing them must involve a deep understanding of data not only in the large, but also in the small.

The field of databases evolves at a fast pace. In order to be prepared, to the extent possible, for the (r)evolutions that will take place in the next few decades, the emphasis of the lecture will be on the paradigms and core design ideas, while today's technologies will serve as supporting illustrations thereof.

After attending this lecture, you should have gained an overview and understanding of the Big Data landscape, which is the basis on which one can make informed decisions, i.e., pick and orchestrate the relevant technologies together for addressing each business use case efficiently and consistently.
Content: This course gives an overview of database technologies and of the most important database design principles that lay the foundations of the Big Data universe. We take the monolithic, one-machine relational stack from the 1970s, smash it down and rebuild it on top of large clusters: starting with distributed storage, and all the way up to syntax, models, validation, processing, indexing, and querying. A broad range of aspects is covered with a focus on how they all fit together in the big picture of the Big Data ecosystem.

No data is harmed during this course; however, please be psychologically prepared that our data may not always be in third normal form.

- physical storage: distributed file systems (HDFS), object storage (S3), key-value stores

- logical storage: document stores (MongoDB), column stores (HBase), graph databases (neo4j), data warehouses (ROLAP)

- data formats and syntaxes (XML, JSON, RDF, Turtle, CSV, XBRL, YAML, protocol buffers, Avro)

- data shapes and models (tables, trees, graphs, cubes)

- type systems and schemas: atomic types, structured types (arrays, maps), set-based type systems (?, *, +)

- an overview of functional, declarative programming languages across data shapes (SQL, XQuery, JSONiq, Cypher, MDX)

- the most important query paradigms (selection, projection, joining, grouping, ordering, windowing)

- paradigms for parallel processing, two-stage (MapReduce) and DAG-based (Spark); a minimal sketch follows this list

- resource management (YARN)

- what a data center is made of and why it matters (racks, nodes, ...)

- underlying architectures (internal machinery of HDFS, HBase, Spark, neo4j)

- optimization techniques (functional and declarative paradigms, query plans, rewrites, indexing)

- applications.

Large scale analytics and machine learning are outside of the scope of this course.
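
To give a flavour of the parallel-processing paradigms listed above (see the item on MapReduce and Spark), here is a minimal word-count sketch on Spark's RDD API; the use of PySpark and the input path are assumptions for illustration, not part of the course material:

    # MapReduce-style word count expressed on Spark (illustrative sketch).
    from pyspark import SparkContext

    sc = SparkContext(appName="wordcount-sketch")

    counts = (
        sc.textFile("hdfs:///path/to/input.txt")    # hypothetical input location
          .flatMap(lambda line: line.split())       # "map" phase: one token per word
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b)          # "reduce" phase: sum the counts per key
    )

    for word, count in counts.take(10):
        print(word, count)

    sc.stop()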
Literature: Papers from scientific conferences and journals. References will be given as part of the course material during the semester.
Prerequisites / Notice: This course, in the autumn semester, is only intended for:
- Computer Science students
- Data Science students
- CBB students with a Computer Science background

Mobility students in CS are also welcome and encouraged to attend. If you experience any issue while registering, please contact the study administration and you will be gladly added.

For students of all other departments interested in this fascinating topic: I would love to have you visit my lectures as well! So there is a series of two courses specially designed for you:
- "Information Systems for Engineers" (SQL, relational databases): this Fall
- "Big Data for Engineers" (similar to Big Data, but adapted for non Computer Scientists): Spring 2021
There is no hard dependency, so you can either them in any order, but it may be more enjoyable to start with Information Systems for Engineers.

Students who successfully completed Big Data for Engineers are not allowed to enrol in the course Big Data.
263-3845-00L | Data Management Systems | W | 8 credits | 3V + 1U + 3A | G. Alonso
Abstract: The course will cover the implementation aspects of data management systems, using relational database engines as a starting point to cover the basic concepts of efficient data processing and then expanding those concepts to modern implementations in data centers and the cloud.
Learning objective: The goal of the course is to convey the fundamental aspects of efficient data management from a systems implementation perspective: storage, access, organization, indexing, consistency, concurrency, transactions, distribution, query compilation vs. interpretation, data representations, etc. Using conventional relational engines as a starting point, the course will aim at providing in-depth coverage of the latest technologies used in data centers and the cloud to implement large-scale data processing in various forms.
Content: The course will first cover fundamental concepts in data management: storage, locality, query optimization, declarative interfaces, concurrency control and recovery, buffer managers, management of the memory hierarchy, presenting them in a system-independent manner. The course will place a special emphasis on understanding these basic principles, as they are key to understanding what problems existing systems try to address. It will then proceed to explore their implementation in modern relational engines supporting SQL, and then expand to the range of systems used in the cloud: key-value stores, geo-replication, query as a service, serverless, large-scale analytics engines, etc.
Literature: The main source of information for the course will be articles and research papers describing the architecture of the systems discussed. The list of papers will be provided at the beginning of the course.
Prerequisites / Notice: The course requires completion of the Data Modeling and Databases course at the Bachelor level, as it assumes knowledge of databases and SQL.
Competencies
Subject-specific Competencies: Concepts and Theories (assessed); Techniques and Technologies (assessed)
263-4500-00L | Advanced Algorithms | W | 9 credits | 3V + 2U + 3A | M. Ghaffari, G. Zuzic
Takes place for the last time.
Abstract: This is a graduate-level course on algorithm design (and analysis). It covers a range of topics and techniques in approximation algorithms, sketching and streaming algorithms, and online algorithms.
Learning objective: This course familiarizes the students with some of the main tools and techniques in modern subareas of algorithm design.
Content: The lectures will cover a range of topics, tentatively including the following: graph sparsifications while preserving cuts or distances, various approximation algorithm techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms, sketching algorithms, and derandomization.
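As one concrete example from the topics above (a standard textbook formulation, included as background rather than as a quote from the lecture notes), the multiplicative weight update method keeps a weight \(w_t(i)\) per expert \(i\) and, after observing a loss \(\ell_t(i) \in [0,1]\), updates

    \[ w_{t+1}(i) \;=\; w_t(i)\,\bigl(1 - \eta\,\ell_t(i)\bigr), \]

which, with a suitable learning rate \(\eta\), guarantees a total loss within \(O(\sqrt{T \log n})\) of the best fixed expert in hindsight.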
Lecture notes: https://people.inf.ethz.ch/gmohsen/AA21/
Prerequisites / Notice: This course is designed for master's and doctoral students, and it especially targets those interested in theoretical computer science, but it should also be accessible to final-year bachelor students.

Sufficient comfort with both (A) Algorithm Design & Analysis and (B) Probability & Concentrations is expected. E.g., having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you're ready for this class or not, please consult the instructor.
Core Electives
Number | Title | Type | ECTS | Hours | Lecturers
151-0563-01L | Dynamic Programming and Optimal Control | W | 4 credits | 2V + 1U | R. D'Andrea
Abstract: Introduction to Dynamic Programming and Optimal Control.
Learning objective: Covers the fundamental concepts of Dynamic Programming & Optimal Control.
Content: Dynamic Programming Algorithm; Deterministic Systems and Shortest Path Problems; Infinite Horizon Problems, Bellman Equation; Deterministic Continuous-Time Optimal Control.
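For orientation (a standard form, quoted as background rather than from the course notes), the Bellman equation for an infinite-horizon discounted problem with stage cost \(g\), discount factor \(\alpha\) and transition probabilities \(p_{ij}(u)\) reads

    \[ J^*(i) \;=\; \min_{u \in U(i)} \Bigl[\, g(i,u) + \alpha \sum_j p_{ij}(u)\, J^*(j) \,\Bigr], \]

and its solution \(J^*\) is the optimal cost-to-go, with the minimizing \(u\) defining an optimal stationary policy.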
Literature: Dynamic Programming and Optimal Control by Dimitri P. Bertsekas, Vol. I, 3rd edition, 2005, 558 pages, hardcover.
Prerequisites / Notice: Requirements: Knowledge of advanced calculus, introductory probability theory, and matrix-vector algebra.
227-0101-00L | Discrete-Time and Statistical Signal Processing | W | 6 credits | 4G | H.‑A. Loeliger
Abstract: The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications: discrete-time linear filters, inverse filters and equalization, DFT, discrete-time stochastic processes, elements of detection theory and estimation theory, LMMSE estimation and LMMSE filtering, LMS algorithm, Viterbi algorithm.
Learning objective: The course introduces some fundamental topics of digital signal processing with a bias towards applications in communications. The two main themes are linearity and probability. In the first part of the course, we deepen our understanding of discrete-time linear filters. In the second part of the course, we review the basics of probability theory and discrete-time stochastic processes. We then discuss some basic concepts of detection theory and estimation theory, as well as some practical methods including LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm. A recurrent theme throughout the course is the stable and robust "inversion" of a linear filter.
Content: 1. Discrete-time linear systems and filters:
state-space realizations, z-transform and spectrum,
decimation and interpolation, digital filter design,
stable realizations and robust inversion.

2. The discrete Fourier transform and its use for digital filtering.

3. The statistical perspective:
probability, random variables, discrete-time stochastic processes;
detection and estimation: MAP, ML, Bayesian MMSE, LMMSE;
Wiener filter, LMS adaptive filter, Viterbi algorithm.
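
As a small illustration of one of the practical methods above (standard textbook form, included as background), the LMS adaptive filter updates its coefficient vector \(\hat{w}_n\) from the input \(x_n\) and the desired output \(d_n\) via

    \[ e_n = d_n - \hat{w}_n^\top x_n, \qquad \hat{w}_{n+1} = \hat{w}_n + \mu\, e_n\, x_n, \]

a stochastic-gradient approximation of the LMMSE (Wiener) solution in which the step size \(\mu\) trades convergence speed against misadjustment.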
Lecture notes: Lecture notes are available.
227-0417-00L | Information Theory I | W | 6 credits | 4G | A. Lapidoth
Abstract: This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Learning objective: The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content: The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
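Two of the central quantities in this list, written out as background (standard definitions, not a substitute for the lecture):

    \[ H(X) = -\sum_x p(x) \log p(x), \qquad C = \max_{p(x)} I(X;Y), \]

where the channel coding theorem states that reliable communication is possible at all rates below the capacity \(C\) and impossible at rates above it.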
Literature: T. M. Cover and J. Thomas, Elements of Information Theory (second edition)
227-0689-00L | System Identification | W | 4 credits | 2V + 1U | R. Smith
Abstract: Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.
Learning objective: To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis on the development of models suitable for feedback control design purposes. To provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality and data quantity.
Content: Introduction to modeling: black-box and grey-box models; parametric and non-parametric models; ARX, ARMAX (etc.) models.
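For concreteness (a standard form, stated as background), an ARX model relates the input \(u\) and output \(y\) through

    \[ y(t) + a_1 y(t-1) + \dots + a_{n_a} y(t-n_a) \;=\; b_1 u(t-1) + \dots + b_{n_b} u(t-n_b) + e(t), \]

so that, with \(e(t)\) modeled as white noise, the parameters \(a_i, b_j\) can be estimated from input-output data by linear least squares.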

Predictive, open-loop, black-box identification methods. Time and frequency domain methods. Subspace identification methods.

Optimal experimental design, Cramer-Rao bounds, input signal design.

Parametric identification methods. On-line and batch approaches.

Closed-loop identification strategies. Trade-off between controller performance and information available for identification.
Literature"System Identification; Theory for the User" Lennart Ljung, Prentice Hall (2nd Ed), 1999.

Additional papers will be available via the course Moodle.
Prerequisites / Notice: Control systems (227-0216-00L) or equivalent.
227-2210-00L | Computer Architecture | W | 8 credits | 6G + 1A | O. Mutlu
Abstract: Computer architecture is the science & art of designing and optimizing hardware components and the hardware/software interface to create a computer that meets design goals. This course covers basic components of a modern computing system (memory, processors, interconnects, accelerators). The course takes a hardware/software cooperative approach to understanding and designing computing systems.
Learning objective: We will learn the fundamental concepts of the different parts of modern computing systems, as well as the latest major research topics in Industry and Academia. We will extensively cover memory systems (including DRAM and new Non-Volatile Memory technologies, memory controllers, flash memory), parallel computing systems (including multicore processors, coherence and consistency, GPUs), heterogeneous computing, processing-in-memory, interconnection networks, specialized systems for major data-intensive workloads (e.g. graph analytics, bioinformatics, machine learning), etc.
Content: The principles presented in the lecture are reinforced in the laboratory through 1) the design and implementation of a cycle-accurate simulator, where we will explore different components of a modern computing system (e.g., pipeline, memory hierarchy, branch prediction, prefetching, caches, multithreading), and 2) the extension of state-of-the-art research simulators (e.g., Ramulator) for more in-depth understanding of specific system components (e.g., memory scheduling, prefetching).
Lecture notes: All the materials (including lecture slides) will be provided on the course website: https://safari.ethz.ch/architecture/

The video recordings of the lectures are expected to be made available after lectures.
Literature: We will provide required and recommended readings in every lecture. They will mainly consist of research papers presented in major Computer Architecture and related conferences and journals.
Prerequisites / Notice: Digital Design and Computer Architecture.
252-0417-00L | Randomized Algorithms and Probabilistic Methods | W | 10 credits | 3V + 2U + 4A | A. Steger
Abstract: Las Vegas & Monte Carlo algorithms; inequalities of Markov, Chebyshev, Chernoff; negative correlation; Markov chains: convergence, rapidly mixing; generating functions. Examples include: min cut, median, balls and bins, routing in hypercubes, 3SAT, card shuffling, random walks.
Learning objective: After this course students will know fundamental techniques from probabilistic combinatorics for designing randomized algorithms and will be able to apply them to solve typical problems in these areas.
Content: Randomized algorithms are algorithms that "flip coins" to take certain decisions. This concept extends the classical model of deterministic algorithms and has become very popular and useful within the last twenty years. In many cases, randomized algorithms are faster, simpler or just more elegant than deterministic ones. In the course, we will discuss basic principles and techniques and derive from them a number of randomized methods for problems in different areas.
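As background for the tail inequalities named in the abstract (standard statements, quoted for orientation): for a non-negative random variable \(X\) and \(a > 0\), Markov's inequality gives \(\Pr[X \ge a] \le \mathbb{E}[X]/a\); for any \(X\) with finite variance, Chebyshev's inequality gives \(\Pr[|X - \mathbb{E}[X]| \ge a] \le \operatorname{Var}[X]/a^2\); and for a sum \(X\) of independent \(\{0,1\}\)-valued random variables with mean \(\mu\) and \(0 < \delta \le 1\), a Chernoff bound gives

    \[ \Pr[X \ge (1+\delta)\mu] \;\le\; e^{-\delta^2 \mu / 3}. \]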
Lecture notes: Yes.
Literature:
- Randomized Algorithms, Rajeev Motwani and Prabhakar Raghavan, Cambridge University Press (1995)
- Probability and Computing, Michael Mitzenmacher and Eli Upfal, Cambridge University Press (2005)
252-1407-00L | Algorithmic Game Theory | W | 7 credits | 3V + 2U + 1A | P. Penna
Abstract: Game theory provides a formal model to study the behavior and interaction of self-interested users and programs in large-scale distributed computer systems without central control. The course discusses algorithmic aspects of game theory.
Learning objective: Learning the basic concepts of game theory and mechanism design, acquiring the computational paradigm of self-interested agents, and using these concepts in the computational and algorithmic setting.
Content: The Internet is a typical example of a large-scale distributed computer system without central control, with users that are typically only interested in their own good. For instance, they are interested in getting high bandwidth for themselves, but don't care about others, and the same is true for computational load or download rates. Game theory provides a mathematical model for the behavior and interaction of such selfish users and programs. Classic game theory dates back to the 1930s and typically does not consider algorithmic aspects at all. Only in recent years have algorithms and game theory been considered together, in an attempt to reconcile selfish behavior of independent agents with the common good.

This course discusses algorithmic aspects of game-theoretic models, with a focus on recent algorithmic and mathematical developments. Rather than giving an overview of such developments, the course aims to study selected important topics in depth.

Outline:
- Introduction to classic game-theoretic concepts.
- Existence of stable solutions (equilibria), algorithms for computing equilibria, computational complexity.
- Speed of convergence of natural game playing dynamics such as best-response dynamics or regret minimization.
- Techniques for bounding the quality loss due to selfish behavior versus optimal outcomes under central control (a.k.a. the 'Price of Anarchy'; a formal definition follows this outline).
- Design and analysis of mechanisms that induce truthful behavior or near-optimal outcomes at equilibrium.
- Selected current research topics, such as Google's Sponsored Search Auction, the U.S. FCC Spectrum Auction, Kidney Exchange.
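
The 'Price of Anarchy' mentioned in the outline is commonly defined (standard definition, included as background) for a cost-minimization game with social cost \(\mathrm{cost}(\cdot)\) and socially optimal outcome \(s^*\) as

    \[ \mathrm{PoA} \;=\; \max_{s \,\in\, \text{equilibria}} \frac{\mathrm{cost}(s)}{\mathrm{cost}(s^*)}, \]

so a small Price of Anarchy certifies that selfish behavior loses little compared to a centrally controlled optimum.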
Lecture notes: Lecture notes will usually be posted on the website shortly after each lecture.
Literature: "Algorithmic Game Theory", edited by N. Nisan, T. Roughgarden, E. Tardos, and V. Vazirani, Cambridge University Press, 2008;

"Game Theory and Strategy", Philip D. Straffin, The Mathematical Association of America, 5th printing, 2004

Several copies of both books are available in the Computer Science library.
Prerequisites / Notice: Audience: Although this is a Computer Science course, we encourage participation from all students who are interested in this topic.

Requirements: You should enjoy precise mathematical reasoning. You need to have passed a course on algorithms and complexity. No knowledge of game theory is required.
252-1414-00L | System Security | W | 7 credits | 2V + 2U + 2A | S. Capkun, A. Perrig
Abstract: The first part of the lecture covers individual system aspects, ranging from tamperproof or tamper-resistant hardware, through operating-system-related security mechanisms, to application software systems such as host-based intrusion detection systems. In the second part, the focus is on system design and methodologies for building secure systems.
Learning objective: In this lecture, students learn about the security requirements and capabilities that are expected from modern hardware, operating systems, and other software environments. An overview of available technologies, algorithms and standards is given, with which these requirements can be met.
Content: The first part of the lecture covers individual system aspects, ranging from tamperproof or tamper-resistant hardware, through operating-system-related security mechanisms, to application software systems such as host-based intrusion detection systems. The main topics covered are: tamper-resistant hardware, CPU support for security, protection mechanisms in the kernel, file system security (permissions / ACLs / network filesystem issues), IPC security, mechanisms in more modern OSs, such as Capabilities and Zones, libraries and software tools for security assurance, etc.

In the second part, the focus is on system design and methodologies for building secure systems. Topics include: patch management, common software faults (buffer overflows, etc.), writing secure software (design, architecture, QA, testing), compiler-supported security, language-supported security, logging and auditing (BSM audit, dtrace, ...), cryptographic support, and trustworthy computing (TCG, SGX).

Alongside the lectures, model cases will be elaborated and evaluated in the exercises.
252-3005-00L | Natural Language Processing (restricted registration) | W | 5 credits | 2V + 2U + 1A | R. Cotterell
Number of participants limited to 400.
Abstract: This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Learning objective: The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.
Content: This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Literature: Lectures will make use of textbooks such as the one by Jurafsky and Martin where appropriate, but will also make use of original research and survey papers.
261-5130-00L | Research in Data Science (restricted registration) | W | 6 credits | 13A | Professors
Only for Data Science MSc.
Abstract: Independent work under the supervision of a core or adjunct faculty member of data science.
Learning objective: Independent work under the supervision of a core or adjunct faculty member of data science.
Content: Project done under the supervision of an approved professor.
Prerequisites / Notice: Only students who have passed at least one core course in Data Management and Processing and one core course in Data Analysis can start a research project.

A project description must be submitted at the start of the project to the studies administration.
263-0006-00L | Algorithms Lab (restricted registration) | W | 8 credits | 4P + 3A | A. Steger, E. Welzl
Only for master students!
Abstract: Students learn how to solve algorithmic problems given by a textual description (understanding the problem setting, finding an appropriate model, choosing suitable algorithms, and implementing them). Knowledge of basic algorithms and data structures is assumed; more advanced material and usage of standard libraries for combinatorial algorithms are introduced in tutorials.
Learning objective: The objective of this course is to learn how to solve algorithmic problems given by a textual description. This includes appropriate problem modeling, choice of suitable (combinatorial) algorithms, and implementing them (using C/C++, STL, CGAL, and BGL).
Literature: T. Cormen, C. Leiserson, R. Rivest: Introduction to Algorithms, MIT Press, 1990.
J. Hromkovic: Theoretische Informatik, Teubner, 2004 (English: Theoretical Computer Science, Springer, 2003).
J. Kleinberg, É. Tardos: Algorithm Design, Addison Wesley, 2006.
H. R. Lewis, C. H. Papadimitriou: Elements of the Theory of Computation, Prentice Hall, 1998.
T. Ottmann, P. Widmayer: Algorithmen und Datenstrukturen, Spektrum, 2012.
R. Sedgewick: Algorithms in C++: Graph Algorithms, Addison-Wesley, 2001.
263-0009-00L | Information Security Lab (restricted registration) | W | 8 credits | 2V + 1U + 3P + 1A | K. Paterson, S. Capkun, D. Hofheinz, A. Perrig, S. Shinde
Only for master students!
Number of participants limited to 250.
Abstract: This InterFocus Course will provide a broad, hands-on introduction to Information Security, introducing adversarial thinking and security by design as key approaches to building secure systems.
Learning objective: This course will introduce key concepts from Information Security, both from attack and defence perspectives. Students will gain an appreciation of the complexity and challenge of building secure systems.
Content: The course is organised in two-week segments. In each segment, a new concept from Information Security will be introduced. The overall scope will be broad, including cryptography, protocol design, network security, system security.
Lecture notes: Will be made available during the semester.
Literature: Paul C. van Oorschot, Computer Security and the Internet: Tools and Jewels.
Dan Boneh and Victor Shoup, A Graduate Course in Applied Cryptography.
Prerequisites / Notice: Ideally, students will have taken the D-INFK Bachelor's course "Information Security" or an equivalent course at Bachelor's level.
263-2400-00L | Reliable and Trustworthy Artificial Intelligence | W | 6 credits | 2V + 2U + 1A | M. Vechev
Abstract: Creating reliable and explainable probabilistic models is a fundamental challenge in solving the artificial intelligence problem. This course covers some of the latest and most exciting advances that bring us closer to constructing such models.
Learning objective: The main objective of this course is to expose students to the latest and most exciting research in the area of explainable and interpretable artificial intelligence, a topic of fundamental and increasing importance. Upon completion of the course, the students should have mastered the underlying methods and be able to apply them to a variety of problems.

To facilitate deeper understanding, an important part of the course will be a group hands-on programming project where students will build a system based on the learned material.
Content: This comprehensive course covers some of the latest and most important research advances (over the last 3 years) underlying the creation of safe, trustworthy, and reliable AI (more information here: https://www.sri.inf.ethz.ch/teaching/reliableai21):

* Adversarial Attacks on Deep Learning (noise-based, geometry attacks, sound attacks, physical attacks, autonomous driving, out-of-distribution; see the sketch after this list)
* Defenses against attacks
* Combining gradient-based optimization with logic for encoding background knowledge
* Complete Certification of deep neural networks via automated reasoning (e.g., via numerical relaxations, mixed-integer solvers).
* Probabilistic certification of deep neural networks
* Training deep neural networks to be provably robust via automated reasoning
* Fairness (different notions of fairness, certifiably fair representation learning)
* Federated Learning (introduction, security considerations)
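
As a minimal illustration of the noise-based attacks in the first bullet (a sketch that assumes PyTorch; the course does not prescribe this framework or this exact code), the fast gradient sign method perturbs an input in the direction of the sign of the loss gradient:

    # FGSM sketch (illustrative assumption, not course material).
    import torch
    import torch.nn.functional as F

    def fgsm(model, x, y, eps=0.03):
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)   # classification loss at the clean input
        loss.backward()                       # gradient of the loss w.r.t. the input
        x_adv = x + eps * x.grad.sign()       # one signed-gradient step of size eps
        return x_adv.clamp(0, 1).detach()     # keep the perturbed input in the valid range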
Prerequisites / Notice: While not a formal requirement, the course assumes familiarity with the basics of machine learning (especially linear algebra, gradient descent, and neural networks, as well as basic probability theory). These topics are usually covered in “Intro to ML” classes at most institutions (e.g., “Introduction to Machine Learning” at ETH).

For solving assignments, some programming experience in Python is expected.
263-2800-00L | Design of Parallel and High-Performance Computing (restricted registration) | W | 9 credits | 3V + 2U + 3A | T. Hoefler, M. Püschel
Number of participants limited to 125.
Abstract: Advanced topics in parallel and high-performance computing.
Learning objective: Understand concurrency paradigms and models from a higher perspective and acquire skills for designing, structuring and developing possibly large parallel high-performance software systems. Become able to distinguish parallelism in problem space and in machine space. Become familiar with important technical concepts and with concurrency folklore.
Content: We will cover all aspects of high-performance computing, ranging from architecture through programming up to algorithms. We will start with a discussion of caches and cache coherence in practical computer systems. We will dive into parallel programming concepts such as memory models, locks, and lock-free programming. We will cover performance modeling and parallel design principles as well as basic parallel algorithms.
Prerequisites / Notice: This class is intended for the Computer Science Master's curriculum. Students must have basic knowledge in programming in C as well as computer science theory. Students should be familiar with the material covered in the ETH computer science first-year courses "Parallele Programmierung (parallel programming)" and "Algorithmen und Datenstrukturen (algorithms and data structures)" or equivalent courses.